Job Description
- Develop and maintain backend systems using Python and PySpark, ensuring high performance, scalability, and reliability.
- Participate in the design and implementation of data engineering and ETL pipelines using PySpark.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Mentor engineers, providing technical guidance and code reviews.
- Stay up-to-date with the latest technologies and frameworks, and apply this knowledge to improve existing systems and processes.
- Lead the technical direction of the team, including architecture, design, and implementation of software systems.
- Collaborate with product owners to define and prioritize requirements.
- Develop and maintain technical documentation, including architecture diagrams and technical specifications.
- Participate in code reviews, ensuring high-quality code and adherence to coding standards.
- Collaborate with DevOps and Operations teams to ensure smooth deployment and operation of applications.
- Expertise in DevOps practices and tools for CI/CD pipelines.
- Willingness to learn new technologies and adapt to new challenges.
- Proven experience leading technical teams, including providing technical guidance, mentoring, and coaching.
- Strong technical leadership skills, including the ability to make technical decisions, prioritize tasks, and manage technical resources.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Ability to drive technical innovation, including researching new technologies, evaluating technical options, and recommending technical solutions.
- Strong problem-solving skills, with the ability to debug complex issues, optimize system performance, and ensure high-quality software delivery.
- Experience with Agile development methodologies, such as Scrum or Kanban, and the ability to apply these principles when leading the team.
- Experience with cloud platforms, including Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
- Experience with PySpark for Data Engineering/ETL pipelines.
- Familiarity with containerization using Docker and orchestration using Kubernetes.
Company
Cynet Systems Inc
Location
Toronto
Country
Canada
Salary
100,000