Description

The Role

Tactable is seeking an experienced Senior Data Engineer with strong Databricks expertise to own and further develop our data infrastructure and ingestion, and to help us build our data products. You'll work independently while collaborating closely with engineering and other data teams to build robust, scalable data systems. With a deep understanding of Java, Python, and Databricks, you'll take the lead in building the data pipelines that ingest the information fueling our models and trading systems. The ideal candidate has experience working in an evolving startup environment: an in-the-moment problem solver able to balance short-term and long-term plans. You're energized by building, scaling, and being part of a forward-thinking organization. We are building an incredible company and looking for talented, energetic, and motivated people to join our team. You can learn more about our company, culture, and values here: https://www.tactable.io/careers

Responsibilities:
- Design, build, and maintain highly scalable, robust data pipeline architectures leveraging frameworks such as Apache Spark and Hadoop, along with cloud-based services (e.g., AWS, GCP, Azure).
- Aggregate and integrate large, complex data sets to meet both functional and non-functional requirements, using data warehouse solutions such as Snowflake, Redshift, or BigQuery.
- Develop and implement process-improvement strategies, including automating manual tasks, optimizing data delivery, and restructuring infrastructure for scalability using workflow orchestration tools such as Apache Airflow or Luigi.
- Collaborate with stakeholders (executive, product, data, and design teams) to address data infrastructure and reporting needs, integrating DevOps practices (Docker, Kubernetes, Terraform) for smooth deployment.
- Create data tools for analytics and data science teams, helping them build and optimize data-driven solutions using Python and SQL.
- Implement automated testing frameworks, data validation, and compliance measures to ensure data quality, integrity, and security across the data architecture.
- Provide mentorship and technical expertise to junior engineers and analysts, guiding them in best practices for data engineering, performance optimization, and secure data handling.
- Monitor and optimize data processing performance and resource utilization using relevant metrics and logging/monitoring tools (e.g., Prometheus, Grafana, CloudWatch).

Requirements:
- 8 years of professional experience as a Java Engineer, with at least 4 years in a Data Engineering or similar role
- Strong proficiency in Java and related frameworks (Spring, Hibernate, Spring Boot)
- Hands-on experience with Apache Airflow or Databricks
- Proven track record in designing and implementing large-scale data pipelines and datasets
- In-depth knowledge of SQL and NoSQL databases
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Experience creating RESTful APIs and working with microservices architectures
- Proficiency with standard DevOps tools (Jenkins, Maven, Gradle, Git) and CI/CD pipelines
- Competency with automation and scripting (Bash, Python)
- Excellent analytical, problem-solving, and troubleshooting skills
- Strong communication skills, with the ability to interact effectively with stakeholders at various levels
- Self-directed, proactive, and adept at supporting multiple teams and systems simultaneously
- You must be eligible to work in Canada to be considered for this role

Nice to Have:
- Experience in financial services, banking, insurance, or media sectors
- Knowledge of containerization and orchestration tools such as Docker and Kubernetes
- Experience with streaming technologies such as Apache Kafka or Spark Streaming
- Familiarity with big data technologies, including Spark, Azure Data Factory, BigQuery, and Snowflake
- Understanding of Data Governance frameworks and best practices
Other Skills:
- Degree in Computer Science, Engineering, or equivalent industry experience
- Strong communication and teamwork skills
- Strong time management skills and ability to manage multiple workstreams

What We Offer:
- Hybrid working model
- Comprehensive health benefits
- Generous holidays and flexible PTO
- Laptop/equipment provided
- Potential for professional growth and advancement