Big Data Engineer

Job Description

Job Title: Big Data Engineer

Job Description:

  • 5+ years of IT experience with in-depth knowledge of big data technologies
  • Conceptualize and design big data systems involving Hadoop, PySpark, and Hive (a minimal PySpark sketch of such a pipeline follows this list)
  • Extensive hands-on Python programming experience
  • Extensive experience with Spark and the PySpark API
  • Knowledge of cluster management and storage mechanisms in big data cloud environments
  • Design functional and technical architectures
  • Develop applications on the big data platform using open-source programming languages
  • Hands-on experience with fundamental Hadoop tools and technologies
  • Work closely with administrators, architects, and developers
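
As a concrete illustration of this Hadoop/PySpark/Hive stack, here is a minimal sketch of a batch job that reads a Hive table, aggregates it with PySpark, and writes the result back to Hive. The table and column names (raw_events, analytics.daily_totals, event_date, customer_id, amount) are hypothetical placeholders, not details from this posting.

    # Minimal PySpark batch job: Hive in, Hive out. All names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("daily-aggregation")
        .enableHiveSupport()      # allow reading/writing Hive-managed tables
        .getOrCreate()
    )

    daily_totals = (
        spark.table("raw_events")                     # hypothetical Hive source table
        .filter(F.col("event_date") >= "2024-01-01")  # prune old partitions
        .groupBy("event_date", "customer_id")
        .agg(F.sum("amount").alias("total_amount"))
    )

    (
        daily_totals.write
        .mode("overwrite")
        .partitionBy("event_date")
        .saveAsTable("analytics.daily_totals")        # partitioned Hive output table
    )

On a real cluster this would be submitted via spark-submit and tuned to the cluster's storage layout; the sketch only shows the shape of the DataFrame API calls involved.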

Essential Skills:

  • Strong analytical skills
  • Strong framework design skills
  • Strong code standardization skills

Experience/Background

  • 7+ years of software development experience

What we Offer:

  • Join the success story of a globally trusted brand
  • Partnering with well-known clients
  • Being part of and driving high-performance teams
  • Multiple international career development options
  • Extensive training and learning opportunities
  • Great team environment
  • Regular corporate events
  • Flexible work options


Company: Wipro

Job Posted: a year ago

Job Type: Full-time

Work Mode: On-site

Experience Level: 8-12 years

Location: Zurich, Switzerland

Qualification: Bachelor

Applicants: Be an early applicant

Related Jobs


Big Data Engineer

Hexaware Technologies

Bengaluru, Karnataka, India

Posted: a month ago

Join Hexaware Technologies as a Big Data Engineer working on Azure ADF, Databricks, and PySpark. The role requires hands-on experience in data transformations, DWH concepts, and writing complex SQL queries, plus knowledge of Azure services such as Azure SQL, Azure Blob Storage, and Azure Logic Apps. Coordination with business stakeholders and the ability to work independently on DevOps and Agile projects are crucial. Full-time, on-site opportunity in Bengaluru, Karnataka, India.
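
As a rough illustration of the Azure Databricks/PySpark work described above, the sketch below reads raw files from blob storage, expresses a transformation as Spark SQL, and writes a Delta table. The storage path, view, and table names are invented for the example, and Delta Lake is assumed to be available (it is the default table format on Databricks).

    # Illustrative Databricks-style PySpark transformation; all names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders-transform").getOrCreate()

    # Raw CSV files landed in Azure Blob Storage / ADLS (hypothetical path).
    orders = (
        spark.read
        .option("header", "true")
        .csv("abfss://landing@examplestorage.dfs.core.windows.net/orders/")
    )

    # Register a temp view so the logic can be written as SQL,
    # in line with the "complex SQL queries" requirement.
    orders.createOrReplaceTempView("orders")

    monthly = spark.sql("""
        SELECT customer_id,
               date_trunc('month', CAST(order_date AS DATE)) AS order_month,
               SUM(CAST(amount AS DOUBLE))                   AS monthly_spend
        FROM orders
        GROUP BY customer_id, date_trunc('month', CAST(order_date AS DATE))
    """)

    # Persist as a Delta table for downstream consumption.
    monthly.write.format("delta").mode("overwrite").saveAsTable("gold.monthly_spend")

In an ADF-orchestrated setup, a pipeline activity would typically trigger this notebook or job rather than running it by hand.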


Big data Scala Developer

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: a year ago

Role Name: Big Data Scala Developer

Desired Experience Range: 5 to 10 years

Role Profile: Under general direction, the selected candidate will serve as a Java/Scala Developer responsible for developing and supporting Big Data and analytic financial applications. We are looking for a person who wants to contribute to a robust team of developers and continuously deepen their skills. You will join a vibrant and high-performing team of outstanding technologists, executing on a Big Data initiative that is set to revolutionize our operation both internally and for our customers. This is an opportunity to build skills in Big Data technologies like Scala; prior experience in these is not essential, and training will be available for the successful candidate.

You will need to:

  • Contribute to an Agile planning process
  • Regularly interact directly with product stakeholders
  • Give demos of new functionality
  • Uphold rigorous quality standards
  • Perform unit and integration testing of your code
  • Undertake a 3rd-line support role

Preferred Skills and Experience:

  • 5+ years of experience in Java technologies
  • Exemplary Scala and Java programming skills
  • Good understanding of functional programming paradigms
  • Continuous Integration (e.g. GitLab, Bamboo, Jenkins)
  • 2+ years of experience with Big Data technologies, specifically Hadoop, Scala, Spark, Impala, Hive, Kafka, HBase, and Oozie

Detailed Responsibilities:

  • Experience using SBT and Gradle build tools
  • Proficient with SQL
  • Automated deployment (Puppet, RPM)
  • Automated testing techniques
  • Proficient in Linux/Unix
  • Understanding of the financial markets and their practices
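
For a flavour of the Spark-with-Kafka portion of this stack, here is a small Structured Streaming sketch. The posting is Scala-centric, but the example is written in PySpark for consistency with the other sketches on this page; the broker address, topic, schema, and output paths are all hypothetical, and the spark-sql-kafka connector package must be on the classpath for the "kafka" source to be available.

    # Illustrative Spark Structured Streaming job reading JSON trade events from Kafka.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("trades-stream").getOrCreate()

    trade_schema = StructType([
        StructField("symbol", StringType()),
        StructField("price", DoubleType()),
        StructField("ts", StringType()),
    ])

    trades = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
        .option("subscribe", "trades")                       # hypothetical topic
        .load()
        # Kafka delivers raw bytes; decode the value and parse the JSON payload.
        .select(F.from_json(F.col("value").cast("string"), trade_schema).alias("t"))
        .select("t.*")
    )

    query = (
        trades.writeStream
        .format("parquet")
        .option("path", "/data/trades")                # hypothetical sink directory
        .option("checkpointLocation", "/chk/trades")   # required for fault-tolerant recovery
        .outputMode("append")
        .start()
    )
    query.awaitTermination()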


Big Data Solution Architect

EPAM Systems

Bangalore Urban, Karnataka, India

Posted: a year ago

We are looking for Solution Architects for data-driven projects to join our Data Practice team in India. Together we design and drive solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.

Responsibilities:

  • Design data analytics solutions by utilising the big data technology stack
  • Create and present solution architecture documents with deep technical details
  • Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions
  • Conduct solution architecture reviews/audits; calculate and present ROI
  • Lead implementation of solutions from establishing project requirements and goals to solution "go-live"
  • Participate in the full cycle of pre-sale activities: direct communications with customers, RFP processing, development of proposals for implementation and design of the solution, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives
  • Create and follow a personal education plan in the technology stack and solution architecture
  • Maintain a strong understanding of industry trends and best practices
  • Get involved in engaging new clients to further drive EPAM business in the big data space

Requirements:

  • Minimum of 12+ years of experience
  • Strong hands-on experience as a Big Data Architect with a solid design/development background in Java, Scala, or Python
  • Experience delivering data analytics projects and architecture guidelines
  • Experience with big data solutions on premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud)
  • Production project experience in at least one of the big data technologies: batch processing (Hadoop and MapReduce/Spark/Hive) or NoSQL databases (Cassandra/HBase/Accumulo/Kudu)
  • Knowledge of Agile development methodology, Scrum in particular
  • Experience in direct customer communications and pre-selling business-consulting engagements to clients within large enterprise environments
  • Experience working within a consulting business and pre-sales experience would be highly attractive

We Offer:

  • Opportunity to work on technical challenges that may have impact across geographies
  • Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
  • Opportunity to share your ideas on international platforms
  • Sponsored Tech Talks & Hackathons
  • Unlimited access to LinkedIn Learning solutions
  • Possibility to relocate to any EPAM office for short- and long-term projects
  • Focused individual development
  • Benefits package: health benefits, retirement benefits, paid time off, flexible benefits
  • Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)


Big Data Developer

Infosys

Bengaluru, Karnataka, India

Posted: 9 months ago

Responsibilities (a day in the life of an Infoscion):

  • As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
  • You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
  • You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
  • You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
  • You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Educational Requirements: Bachelor of Engineering

Service Line: Data & Analytics Unit

Additional Responsibilities:

  • Knowledge of more than one technology
  • Basics of architecture and design fundamentals
  • Knowledge of testing tools
  • Knowledge of agile methodologies
  • Understanding of project life cycle activities on development and maintenance projects
  • Understanding of one or more estimation methodologies; knowledge of quality processes
  • Basics of the business domain to understand the business requirements
  • Analytical abilities, strong technical skills, good communication skills
  • Good understanding of the technology and domain
  • Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
  • Awareness of latest technologies and trends
  • Excellent problem solving, analytical, and debugging skills

Technical and Professional Requirements:

  • Primary skills: Bigdata, Bigdata->Hadoop, Bigdata->Hive, Bigdata->Pyspark, Bigdata->Python, Bigdata->Scala, Bigdata->Spark, Opensource->Apache Kafka

Preferred Skills:

  • Bigdata->Hive
  • Opensource->Apache Kafka
  • Bigdata->Hadoop
  • Bigdata->Pyspark
  • Bigdata->Scala
  • Bigdata->Spark
  • Bigdata