


Snowflake, Databricks or Apache Spark





Roles and Responsibilities

•    Managing the entire development lifecycle, from requirements gathering to deployment and maintenance
•    Developing and implementing analytics strategies and roadmaps
•    Collaborating with senior business leaders to align analytics initiatives with business goals
•    Identifying and evaluating new technologies and trends
•    Leading large-scale data engineering projects and managing cross-functional teams
•    Setting technical direction and driving innovation in data infrastructure and analytics solutions
•    Developing and implementing data architecture and modeling standards and best practices
•    Collaborating with executive leadership to align data infrastructure and analytics initiatives with business goals


Primary Skills

Strong experience in leading large-scale data projects
In-depth understanding of data governance, data architecture, and data modeling
Strong business acumen and ability to translate business requirements into technical solutions.
Extensive experience in data engineering and data architecture at scale
In-depth understanding of data security, privacy, and compliance requirements
Experience with data storage and retrieval technologies, SQL and NoSQL
Experience with data processing frameworks such as Snowflake, Databricks or Apache Spark
Bachelor's degree in Computer Science, Information Systems, or Business Management (desirable), or specialized training/certification (at least 1 year), or equivalent work experience (at least 10 years)

Typically requires 7-10 years of related technical experience

Secondary Skills
Proven track record in developing and implementing successful analytics strategies
Experience with modern data platforms and technologies such as serverless computing, containerization, and microservices
Experience with cloud-based data platforms such as AWS, GCP, or Azure


Primary Location: India-Maharashtra-Pune

Experience Required (In Years): Minimum 7, Maximum 10



Zensar Technologies

Job Posted: 9 months ago


Experience Level: 3-7 years


Software Engineering


Pune, Maharashtra, India





Related Jobs


Web Developer (advanced SQL, C# and/or Python experience)

Zensar Technologies

Pune, Maharashtra, India

Posted: 9 months ago

Description

Job Description: We are seeking an experienced and highly skilled Advanced Web Developer to join our dynamic team. As an Advanced Web Developer, you will be responsible for creating and maintaining innovative web applications that deliver outstanding user experiences. You will work collaboratively with cross-functional teams to design, develop, and implement high-quality web solutions.

Responsibilities:

•    Develop and maintain complex web applications using modern technologies and best practices
•    Write efficient, reliable, and scalable code in languages such as C#, Python, and SQL
•    Design and implement database schemas, optimize queries, and manage data integration
•    Collaborate with UI/UX designers to create visually appealing and intuitive user interfaces
•    Ensure web applications are responsive, performant, and accessible across various devices and browsers
•    Troubleshoot and debug issues, and implement solutions to enhance application performance and functionality
•    Stay up to date with industry trends, emerging technologies, and best practices in web development
•    Mentor and assist junior developers in their professional growth

Requirements:

•    Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
•    Proven experience (X years) as a web developer, with a strong portfolio of web projects showcasing your skills
•    Advanced knowledge of SQL, with experience in designing and optimizing relational databases
•    Proficiency in at least one of the following programming languages: C# or Python
•    Familiarity with front-end technologies such as HTML, CSS, JavaScript, and popular frameworks (e.g., React, Angular, or Vue)
•    Experience with version control systems (e.g., Git) and agile development methodologies
•    Solid understanding of web security best practices
•    Excellent problem-solving skills and the ability to work effectively in a collaborative team environment
•    Strong communication skills, both written and verbal
Primary Location: India-Maharashtra-Pune
Job Posting: Aug 23, 2023
Experience Required (In Years): Minimum 5, Maximum 7


Azure Databricks

Zensar Technologies

Pune, Maharashtra, India

Posted: 9 months ago

What's this role about?

As an Azure Technical Specialist, you will be responsible for migration to Azure Data Cloud or implementing data solutions using Azure Data Cloud services, including integration with existing data and analytics platforms and tools.

Here's how you'll contribute:

•    Creating impact faster: delivering more impact in less time by quickly deploying solutions or augmenting existing ones to enable teams to accelerate business results and increase time to value
•    Breaking through barriers: helping create a better customer experience by taking a data-first approach and delivering insights
•    Adapting to anything: agility in reacting and responding to new business priorities, market conditions, and customer opportunities with rapidly deployable solutions
•    Innovating anywhere: solving problems with powerful solutions, enabling interoperable solutions across multiple lines of business

Skills required to contribute:

•    7-9 years of data and analytics experience, with a minimum of 4 years in Azure Cloud
•    Excellent communication and presentation skills
•    Extensive experience in the Azure stack: Azure Databricks, Azure Synapse, ADLS, Azure SQL DB, Azure Data Factory, Cosmos DB, Analysis Services, Event Hub, etc.
•    Excellent experience in data processing using Azure Databricks, complex data transformation using PySpark or Python, and building end-to-end data pipelines using Azure Databricks
•    Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler
•    Design and build production data pipelines, from ingestion to consumption, within a big data architecture using Java, Python, or Scala
•    Good experience in designing and delivering data analytics solutions using Azure Cloud native services
•    Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design
•    Documentation of solutions (e.g., data models, configurations, and setup)
•    Well versed with Waterfall, Agile, Scrum, and similar project delivery methodologies
•    Experienced in internal as well as external stakeholder management
•    Experience with MDM / DQM / Data Governance technologies such as Collibra, Ataccama, Alation, or Reltio will be an added advantage
•    Azure Data Engineer certification is a must; Databricks or Spark certification will be beneficial

Nice to have skills:

•    Working experience with Snowflake and reporting tools such as Power BI