
Big Data Engineer


Job Description

Job Title: Big Data Engineer

Job Description:

  • 5+ years of IT experience, with in-depth knowledge of big data technologies
  • Conceptualize and design big data systems involving Hadoop, PySpark, and Hive
  • Extensive experience with Python programming
  • Extensive experience with Spark and PySpark
  • Knowledge of cluster management and storage mechanisms in big data cloud environments
  • Design functional and technical architectures
  • Develop applications on the big data platform using open-source programming languages
  • Hands-on experience with fundamental Hadoop tools and technologies
  • Work closely with administrators, architects, and developers

Essential Skills:

  • Strong analytical skills
  • Strong framework design skills
  • Strong code standardization skills

Experience/Background

  • 7+ years software development experience

What We Offer:

  • Join the success story of a globally trusted brand
  • Partner with well-known clients
  • Be part of, and drive, high-performance teams
  • Multiple international career development options
  • Extensive training and learning opportunities
  • Great team environment
  • Regular corporate events
  • Flexible work options

Big Data Admin

Company

Wipro

Job Posted

9 months ago

Job Type

Full-time

Work Mode

On-site

Experience Level

8-12 years

Locations

Zurich, Switzerland

Related Jobs

Big Data Solution Architect

EPAM Systems

Bangalore Urban, Karnataka, India

Posted: 7 months ago

Job Description

We are looking for Solution Architects for data-driven projects to join our Data Practice team in India. Together we design and drive many solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.

Responsibilities

  • Design data analytics solutions utilising the big data technology stack
  • Create and present solution architecture documents with deep technical detail
  • Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions
  • Conduct solution architecture reviews/audits; calculate and present ROI
  • Lead implementation of solutions, from establishing project requirements and goals to solution "go-live"
  • Participate in the full cycle of pre-sale activities: direct communications with customers, RFP processing, development of proposals for implementation and design of the solution, presentation of the proposed solution architecture to the customer, and participation in technical meetings with customer representatives
  • Create and follow a personal education plan in the technology stack and solution architecture
  • Maintain a strong understanding of industry trends and best practices
  • Get involved in engaging new clients to further drive EPAM business in the big data space

Requirements

  • 12+ years of experience minimum
  • Strong hands-on experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python
  • Experience delivering data analytics projects and architecture guidelines
  • Experience with big data solutions on premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud)
  • Production project experience in at least one of the big data technologies: batch processing (Hadoop and MapReduce/Spark/Hive) or NoSQL databases (Cassandra/HBase/Accumulo/Kudu)
  • Knowledge of Agile development methodology, Scrum in particular
  • Experience in direct customer communications and in pre-selling business-consulting engagements to clients within large enterprise environments
  • Experience working within a consulting business and pre-sales experience would be highly attractive

We Offer

  • Opportunity to work on technical challenges that may have impact across geographies
  • Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
  • Opportunity to share your ideas on international platforms
  • Sponsored Tech Talks & Hackathons
  • Unlimited access to LinkedIn learning solutions
  • Possibility to relocate to any EPAM office for short- and long-term projects
  • Focused individual development
  • Benefits package: health benefits, retirement benefits, paid time off, flexible benefits
  • Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)

Big data Scala Developer

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: 8 months ago

Role Name: Big Data Scala Developer

Desired Experience Range: 5 to 10 years

Role Profile

Under general direction, the selected candidate will serve as a Java/Scala Developer responsible for developing and supporting Big Data and analytics financial applications. We are looking for someone who wants to contribute to a robust team of developers and continuously deepen their skills. You will join a vibrant, high-performing team of outstanding technologists, executing on a Big Data initiative that is set to revolutionize our operation both internally and for our customers. This is an opportunity to build skills in Big Data technologies like Scala; experience in these is not essential, and training will be available for the successful candidate.

You will need to:

  • Contribute to an Agile planning process
  • Regularly interact directly with product stakeholders
  • Give demos of new functionality
  • Uphold rigorous quality standards
  • Perform unit and integration testing of your code
  • Undertake a third-line support role

Preferred Skills and Experience

  • 5+ years of experience in Java technologies
  • Exemplary Scala and Java programming skills
  • Good understanding of functional programming paradigms
  • Continuous integration (e.g. GitLab, Bamboo, Jenkins)
  • 2+ years of experience with Big Data technologies, specifically Hadoop, Scala, Spark, Impala, Hive, Kafka, HBase, and Oozie

Detailed Responsibilities

  • Experience using SBT and Gradle build tools
  • Proficient with SQL
  • Automated deployment (Puppet, RPM)
  • Automated testing techniques
  • Proficient in Linux/Unix
  • Understanding of the financial markets and their practices

Big Data Developer

Infosys

Bengaluru, Karnataka, India

Posted: 6 months ago

Responsibilities

A day in the life of an Infoscion:

  • As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
  • You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
  • You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
  • You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
  • You will be a key contributor to building efficient programs/systems.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Educational Requirements: Bachelor of Engineering

Service Line: Data & Analytics Unit

Additional Responsibilities:

  • Knowledge of more than one technology
  • Basics of architecture and design fundamentals
  • Knowledge of testing tools
  • Knowledge of agile methodologies
  • Understanding of project life cycle activities on development and maintenance projects
  • Understanding of one or more estimation methodologies; knowledge of quality processes
  • Basics of the business domain, to understand the business requirements
  • Analytical abilities, strong technical skills, and good communication skills
  • Good understanding of the technology and domain
  • Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
  • Awareness of the latest technologies and trends
  • Excellent problem-solving, analytical, and debugging skills

Technical and Professional Requirements:

Primary skills: Big Data (Hadoop, Hive, PySpark, Python, Scala, Spark), Open Source (Apache Kafka)

Preferred skills: Big Data (Hive, Hadoop, PySpark, Scala, Spark), Open Source (Apache Kafka)

Manager Data Engineering DE - Big Data Azure

Publicis Sapient

Bengaluru, Karnataka, India

Posted: 8 months ago

Job Description

As Manager, Data Engineering, you will be responsible for translating client requirements into design, and for architecting and implementing Azure Cloud-based big data solutions. Your role will focus on delivering high-quality solutions by independently driving design discussions related to data ingestion, transformation and consumption, data storage and computation frameworks, performance optimization, infrastructure, automation and cloud computing, and data governance and security. The role requires a hands-on technologist with expertise in Big Data solution architecture and a strong programming background in Java, Scala, or Python.

Your Impact:

  • Provide technical leadership and a hands-on implementation role in the areas of data engineering, including data ingestion, data access, modeling, data processing, visualization, design, and implementation.
  • Lead a team to deliver high-quality big data solutions on Azure Cloud.
  • Manage functional and non-functional scope and quality.
  • Help establish standard data practices like governance, and address other non-functional concerns like data security, privacy, and quality.
  • Manage and provide technical leadership to a data program implementation based on the requirements, using agile methodologies.
  • Participate in workshops with clients and align client stakeholders to optimal solutions.
  • Consulting, soft skills, thought leadership, and mentorship.
  • People management, contributing to hiring and capability building.

Qualifications

Your Skills & Experience:

  • 8+ years of overall IT experience, with 3+ years in data-related technologies and 1+ years of expertise in data-related Azure Cloud services; delivered at least one project as an architect.
  • Knowledge of Big Data architecture patterns and experience delivering end-to-end Big Data solutions on Azure Cloud are mandatory.
  • Expert in programming languages like Java/Scala; Python is good to have.
  • Expert in at least one distributed data processing framework: Spark (Core, Streaming, SQL), Storm, Flink, etc.
  • Expert in the Hadoop ecosystem with an Azure cloud distribution; has worked with at least one big data ingestion tool (Sqoop, Flume, NiFi, etc.) and with distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); familiarity with traditional tools like Informatica and Talend is good to have.
  • Has worked with NoSQL solutions like MongoDB, Cassandra, HBase, etc., or cloud-based NoSQL offerings like DynamoDB, Bigtable, etc.
  • Good exposure to development with CI/CD pipelines.
  • Knowledge of containerization, orchestration, and Kubernetes would be an added advantage.

Set Yourself Apart With:

  • Certification on the Azure cloud platform or big data technologies.
  • Strong analytical and problem-solving skills.
  • Excellent understanding of the data technologies landscape/ecosystem.

A Tip from the Hiring Manager: Join the team to sharpen your skills and expand your collaborative methods. Make an impact on our clients and their businesses directly through your work.

Additional Information

  • Gender-neutral policy
  • 18 paid holidays throughout the year
  • Generous parental leave and new-parent transition program
  • Flexible work arrangements
  • Employee Assistance Programs to support wellness and well-being

Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of the next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.