


Camunda/BPM - Angular - Node.js






Roles and Responsibilities
• Develops reusable, generic services/APIs to be used within DIP based on business need
• Develops other platform capabilities using agreed technologies, aligning with DIP architecture
• Ensures alignment with DIP architecture, design, and best practices
• Should be able to work independently on tasks from a technical description

Required Skills
• Good hands-on experience in Angular 10+ and Node.js
• Hands-on experience using Camunda BPM
• Mandatory skills: Angular 10+, JavaScript, HTML, CSS, Camunda/BPM
• Good to have: MongoDB application development, Java development knowledge

Experience: 5+ years
Notice Period: 0-45 days


Primary Location: India-Maharashtra-Pune

Experience Required (in years): Minimum 5, Maximum 10



Zensar Technologies

Job Posted: 9 months ago

Experience Level: 3-7 years


Software Engineering


Pune, Maharashtra, India





Related Jobs


Guidewire Portals Testing

Zensar Technologies

Pune, Maharashtra, India

Posted: 9 months ago

We are looking for a candidate with at least 5 years of experience in Guidewire Digital Portals, CustomerEngage, and ProducerEngage. You will test integrations with CC, PC, and other external systems, and manage testing and related processes for Portal development projects and ongoing enhancements. The role involves developing end-to-end functional test plans and working closely with business analysts and developers to identify, script, and execute functional test cases. You will also develop configuration, regression, system, and integration test plans, as well as a user acceptance test plan that ensures adequate coverage of end-user business requirements. You will track issues identified in testing, report defects to developers and product engineers, and develop QA key measures and dashboards. This position is located in Pune, Maharashtra, India.


Snowflake, Databricks or Apache Spark

Zensar Technologies

Pune, Maharashtra, India

Posted: 9 months ago

Description

Roles and Responsibilities
• Managing the entire development lifecycle, from requirements gathering to deployment and maintenance
• Developing and implementing analytics strategies and roadmaps
• Collaborating with senior business leaders to align analytics initiatives with business goals
• Identifying and evaluating new technologies and trends
• Leading large-scale data engineering projects and managing cross-functional teams
• Setting technical direction and driving innovation in data infrastructure and analytics solutions
• Developing and implementing data architecture and modeling standards and best practices
• Collaborating with executive leadership to align data infrastructure and analytics initiatives with business goals

Primary Skills
• Strong experience in leading large-scale data projects
• In-depth understanding of data governance, data architecture, and data modeling
• Extensive experience in data engineering and data architecture at scale
• In-depth understanding of data security, privacy, and compliance requirements
• Strong business acumen and ability to translate business requirements into technical solutions
• Experience with data storage and retrieval technologies, SQL and NoSQL
• Experience with data processing frameworks such as Snowflake, Databricks, or Apache Spark
• Bachelor's degree in Computer Science, Information Systems, or Business Management (desirable), or specialized training/certification (at least 1 year), or equivalent work experience (at least 10 years)
• Typically requires 7-10 years of related technical experience

Secondary Skills
• Proven track record in developing and implementing successful analytics strategies
• Experience with modern data platforms and technologies such as serverless computing, containerization, and microservices
• Experience with cloud-based data platforms such as AWS, GCP, or Azure

Primary Location: India-Maharashtra-Pune
Experience Required (in years): Minimum 7, Maximum 10


Azure Data Bricks

Zensar Technologies

Pune, Maharashtra, India

Posted: 9 months ago

What's this role about?
As an Azure Technical Specialist, you will be responsible for migrating to Azure Data Cloud or implementing data solutions using Azure Data Cloud services, including integration with existing data and analytics platforms and tools.

Here's how you'll contribute:
• Creating impact faster: delivering more impact in less time by quickly deploying solutions or augmenting existing ones, enabling teams to accelerate business results and reduce time to value
• Breaking through barriers: helping create a better customer experience by taking a data-first approach and delivering insights
• Adapting to anything: agility in reacting and responding to new business priorities, market conditions, and customer opportunities with rapidly deployable solutions
• Innovating anywhere: solving problems with powerful, interoperable solutions across multiple lines of business

Skills required to contribute:
• 7-9 years of Data and Analytics experience, with a minimum of 4 years in Azure Cloud
• Excellent communication and presentation skills
• Extensive experience in the Azure stack: Azure Databricks, Azure Synapse, ADLS, Azure SQL DB, Azure Data Factory, Cosmos DB, Analysis Services, Event Hub, etc.
• Excellent experience in data processing using Azure Databricks, complex data transformation using PySpark or Python, and building end-to-end data pipelines using Azure Databricks
• Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler
• Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala
• Good experience in designing and delivering data analytics solutions using Azure Cloud native services
• Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design
• Documentation of solutions (e.g. data models, configurations, and setup)
• Well versed with Waterfall, Agile, Scrum, and similar project delivery methodologies
• Experienced in internal as well as external stakeholder management
• Experience in MDM/DQM/Data Governance technologies like Collibra, Ataccama, Alation, or Reltio will be an added advantage
• Azure Data Engineer certification is a must; Databricks or Spark certification will be beneficial

Nice to have skills:
• Working experience with Snowflake, reporting tools like Power BI, etc.