
PySpark Developer


Smart Summary (Powered by Roshi)
We are looking for a candidate who can analyze data using PySpark and SQL. You should be comfortable working in an Agile environment and have experience following the end-to-end development cycle, including code standards, unit testing, and integration testing. The primary skills for this role are PySpark and SQL; secondary skills include knowledge of Cloud Computing (GCP), Data Modelling, and exposure to Data Quality assessment.

Job Description

  • Ability to analyze data using PySpark and SQL
  • Ability to work in an Agile environment
  • Ability to follow the end-to-end development cycle, including code standards, unit testing, and integration testing

Primary Skill

  • PySpark
  • SQL

Secondary Skill

  • Knowledge of Cloud Computing (GCP), Data Modelling, and exposure to Data Quality assessment

Company

Capgemini

Job Posted

9 months ago

Job Type

Full-time

WorkMode

On-site

Experience Level

3-7 years

Category

Financial Services

Locations

Chennai, Tamil Nadu, India

Hyderabad, Telangana, India

Bengaluru, Karnataka, India

Pune, Maharashtra, India


Related Jobs


PySpark Developer

Tata Consultancy Services

Chennai, Tamil Nadu, India

Posted: 6 months ago

Looking for a candidate with expertise in PySpark, Spark SQL, and working with DataFrames. Must have knowledge of advanced data transformations and a good understanding of SQL. The candidate should be experienced in error handling, logging, and monitoring. Knowledge of Unix and a scheduling tool is necessary. Performance tuning and generic process development skills are required. Familiarity with the banking domain is preferred. Experience in ETL estimation and Teradata BTEQ is a plus.