Big Data Engineer

Job Description

  • 5+ years of IT experience with in-depth knowledge of big data technologies
  • Conceptualize and design big data systems involving Hadoop, PySpark, and Hive
  • Extensive experience in Python programming
  • Extensive experience with Spark and PySpark
  • Knowledge of cluster management and storage mechanisms in big data cloud environments
  • Design functional and technical architectures
  • Develop applications on the big data platform using open-source programming languages
  • Hands-on experience with fundamental Hadoop tools and technologies
  • Work closely with administrators, architects, and developers

Essential Skills:

  • Strong analytical skills
  • Strong framework design skills
  • Strong code standardization skills

Experience/Background

  • 7+ years software development experience

What we Offer:

  • Join the success story of a globally trusted brand
  • Partner with well-known clients
  • Be part of, and help drive, high-performing teams
  • Multiple international career development options
  • Extensive training and learning opportunities
  • Great team environment
  • Regular corporate events
  • Flexible work options


Company

Wipro

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

8-12 years

Locations

Zurich, District Zurich, Zurich, Switzerland

Qualification

Bachelor


Related Jobs

Big Data Engineer

Hexaware Technologies

Bengaluru, Karnataka, India

Posted: 5 months ago

Join Hexaware Technologies as a Big Data Engineer working on Azure ADF, Databricks, and PySpark. The role requires hands-on experience in data transformations, DWH concepts, and writing complex SQL queries, along with knowledge of Azure services such as Azure SQL, Azure Blob Storage, and Azure Logic Apps. Coordination with business stakeholders and the ability to work independently on DevOps and Agile projects are crucial. Full-time, on-site opportunity in Bengaluru, Karnataka, India.


Big Data Solution Architect

EPAM Systems

Bangalore Urban, Karnataka, India

Posted: a year ago

Job Description

We are looking for Solution Architects for data-driven projects to join our Data Practice team in India. Together we design and drive solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.

Responsibilities:

  • Design data analytics solutions by utilising the big data technology stack
  • Create and present solution architecture documents with deep technical details
  • Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions
  • Conduct solution architecture reviews/audits; calculate and present ROI
  • Lead implementation of solutions, from establishing project requirements and goals to solution "go-live"
  • Participate in the full cycle of pre-sale activities: direct communications with customers, RFP processing, development of proposals for implementation and design of the solution, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives
  • Create and follow a personal education plan in the technology stack and solution architecture
  • Maintain a strong understanding of industry trends and best practices
  • Get involved in engaging new clients to further drive EPAM business in the big data space

Requirements:

  • A minimum of 12+ years of experience
  • Strong hands-on experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python
  • Experience delivering data analytics projects and architecture guidelines
  • Experience in big data solutions on premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud)
  • Production project experience in at least one of the big data technologies:
      ◦ Batch processing: Hadoop and MapReduce/Spark/Hive
      ◦ NoSQL databases: Cassandra/HBase/Accumulo/Kudu
  • Knowledge of Agile development methodology, Scrum in particular
  • Experience in direct customer communications and in pre-selling business-consulting engagements to clients within large enterprise environments
  • Experience working within a consulting business and pre-sales experience would be highly attractive

We Offer:

  • Opportunity to work on technical challenges with impact across geographies
  • Vast opportunities for self-development: online university, global knowledge-sharing, and learning opportunities through external certifications
  • Opportunity to share your ideas on international platforms
  • Sponsored Tech Talks & Hackathons
  • Unlimited access to LinkedIn Learning solutions
  • Possibility to relocate to any EPAM office for short- and long-term projects
  • Focused individual development
  • Benefit package: health benefits, retirement benefits, paid time off, flexible benefits
  • Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)

Big Data Scala Developer

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: a year ago

Role Name: Big Data Scala Developer

Desired Experience Range: 5 to 10 years

Role Profile:

Under general direction, the selected candidate will serve as a Java/Scala Developer responsible for developing and supporting Big Data and analytics financial applications. We are looking for someone who wants to contribute to a robust team of developers and continuously deepen their skills. You will join a vibrant, high-performing team of outstanding technologists executing on a Big Data initiative that is set to revolutionize our operation, both internally and for our customers. This is an opportunity to build skills in Big Data technologies like Scala; prior experience in these is not essential, and training will be available for the successful candidate.

You will need to:

  • Contribute to an Agile planning process
  • Regularly interact directly with product stakeholders
  • Give demos of new functionality
  • Uphold rigorous quality standards
  • Perform unit and integration testing of your code
  • Undertake a 3rd-line support role

Preferred Skills and Experience:

  • 5+ years of experience in Java technologies
  • Exemplary Scala and Java programming skills
  • Good understanding of functional programming paradigms
  • Continuous integration (e.g. GitLab, Bamboo, Jenkins)
  • 2+ years of experience with Big Data technologies, specifically Hadoop, Scala, Spark, Impala, Hive, Kafka, HBase, and Oozie

Detailed Responsibilities:

  • Experience using SBT and Gradle build tools
  • Proficient with SQL
  • Automated deployment (Puppet, RPM)
  • Automated testing techniques
  • Proficient in Linux/Unix
  • Understanding of the financial markets and their practices