Operations Manager (Snowflake)


Tech Profile/Essential Skills

· Core Toolset: Snowflake, Marketplace, ELT, Cloud Service Provider (AWS or Azure) Administration.

· Minimum 10 years of industry experience, preferably in financial services.

· Previous management experience, working in a matrix environment.

· Strong ITIL knowledge.

· Excellent communication and interpersonal skills.

· Excellent understanding of data through previous hands-on operational support of IT systems.

· Proficient in distilling complex, detail-oriented messages into a format appropriate to the target audience.

· Proactive, determined and resilient, with a positive attitude.

· Extensive and demonstrable problem-solving and decision-making capabilities.

· Strong negotiation and influencing skills, able to overcome resistance and reach consensus or compromise to attain the required objective.

Detailed Responsibilities

· Work with Solution Architects and Development teams to transition solutions into Snowflake Distribution Operational Support.

· Integrate with internal governance frameworks, assist with Solution Architecture Design (SAD) Production approvals and ITSM for deliveries into Snowflake Distribution venues.

· Responsible for monitoring Snowflake Distribution venues; managing risks, issues and escalations relating to those venues; and for Disaster Recovery failover and failback.

· Maintain Network Topology, Operational Support, Feed Onboarding Contract and Disaster Recovery documentation in conformance with LSEG requirements, meeting support and audit requirements.

· Prioritise fix-on-fail and essential maintenance work; raise and manage ITSM tickets for delivery; handle escalation management and the on-call rota.

· Identify opportunities for Snowflake Distribution venues to reduce risk, reduce cost, improve business efficiency and deliver business benefit.

· Contribute to the Snowflake Centre for Enablement delivering:

o Preferred integration patterns.

o Identification and promotion of best practice.

o Advice, guidance and assistance to disparate teams seeking integration.

o Contributions to evolving LSEG policies and standards.

· Enhance our people and build up Operational Support capability.


Company

Tata Consultancy Services

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

8-12 years

Locations

Hyderabad, Telangana, India

Qualification

Bachelor

Applicants

Be an early applicant

Related Jobs


Operations Support Engineer (Snowflake)

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: a year ago

We are looking for a candidate with 2 years' experience in financial services, proficiency in Snowflake, and a good understanding of data and IT systems. The candidate must have excellent communication skills and be proactive, determined and resilient. Responsibilities include transitioning solutions into Snowflake Distribution, managing Snowflake Distribution venues and disaster recovery, and contributing to the Snowflake Centre for Enablement.


Senior Snowflake Developer

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: a year ago

Tech Profile/Essential Skills

· Core Toolset: Snowflake, Marketplace, ELT, Cloud Service Provider (CSP; AWS or Azure) Administration.
· Minimum 2 years of industry experience, preferably in financial services.
· Excellent communication and interpersonal skills.
· Good understanding of data through previous hands-on contributions to the design, development and implementation of IT systems.
· Proactive, determined and resilient, with a positive attitude.
· Extensive and demonstrable problem-solving and decision-making capabilities.

Detailed Responsibilities

· Design, develop and implement approved capability using agreed tooling and techniques into key LSEG deliveries.
· Ensure all deliverables conform to security, infrastructure, product and other internal requirements.
· Maintain system documentation in conformance with LSEG requirements.
· Identify opportunities for Snowflake Distribution venues to reduce risk, reduce cost, improve business efficiency and deliver business benefit.
· Contribute to the Snowflake Centre for Enablement delivering:
o Preferred integration patterns.
o Identification and promotion of best practice.
o Advice, guidance and assistance to disparate teams seeking integration.
o Contributions to evolving LSEG policies and standards.
· Work with Snowflake technical delivery teams on product enhancements delivering successful business outcomes for LSEG.
· Ensure that all processes and documentation are adequate and effective to meet support and audit requirements.
· Provide frequent status reporting to stakeholders.


Development Manager

Tata Consultancy Services

Hyderabad, Telangana, India

Posted: a year ago

Desired Experience Range: 10 to 15 years

Must-Have

· Core Toolset: Snowflake, Marketplace, ELT, Cloud Service Provider (AWS or Azure) Administration.
· Minimum 10 years of industry experience, preferably in financial services.
· Previous management experience working in a matrix environment.
· Strong ITIL knowledge.
· Excellent communication and interpersonal skills.
· Excellent understanding of data through previous hands-on design, development and implementation of IT systems.
· Proficient in distilling complex, detail-oriented messages into a format appropriate to the target audience.
· Proactive, determined and resilient, with a positive attitude.
· Extensive and demonstrable problem-solving and decision-making capabilities.
· Strong negotiation and influencing skills, able to overcome resistance and reach consensus or compromise to attain the required objective.

Roles & Responsibilities

· Work with the Solution Architect to ensure System Architecture Designs (SADs) meet all Snowflake Distribution requirements, including security, infrastructure, product and other internal requirements.
· Via internal governance frameworks, assist with SAD approvals, then design, develop and implement approved capability using agreed tooling and techniques into Snowflake Distribution delivery venues.
· Assign team members to development work, manage according to the delivery plan, liaise with Project Managers and provide weekly updates.
· Maintain Network Topology, Operational Support and Feed Onboarding Contract documentation in conformance with LSEG requirements.
· Prioritise fix-on-fail and essential maintenance work; raise and manage ITSM tickets for delivery; handle escalation management.
· Identify opportunities for Snowflake Distribution venues to reduce risk, reduce cost, improve business efficiency and deliver business benefit.
· Act as the technical expert in Snowflake, competent in other tooling.
· Contribute to the Snowflake Centre for Enablement delivering:
o Preferred integration patterns.
o Identification and promotion of best practice.
o Advice, guidance and assistance to disparate teams seeking integration.
o Contributions to evolving LSEG policies and standards.
· Ensure that all processes and documentation are adequate and effective to meet support and audit requirements.
· Provide frequent status reporting to management.
· Enhance our people and showcase Snowflake Distribution capability to colleagues.


Enterprise Data Operations Associate Manager

PepsiCo

Hyderabad, Telangana, India

Posted: a year ago

Overview

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:

· Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
· Responsible for day-to-day data collection, transportation, maintenance/curation and access to the PepsiCo corporate data asset.
· Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders.
· Increase awareness about available data and democratize access to it across the company.

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, and will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing and logistics. You will work closely with process

Responsibilities

· Active contributor to code development in projects and services.
· Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
· Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
· Responsible for implementing best practices around systems integration, security, performance and data management.
· Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape.
· Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
· Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners.
· Develop and optimize procedures to "productionalize" data science models.
· Define and manage SLAs for data products and processes running in production.
· Support large-scale experimentation done by data scientists.
· Prototype new approaches and build solutions at scale.
· Research state-of-the-art methodologies.
· Create documentation for learnings and knowledge transfer.
· Create and audit reusable packages or libraries.

Qualifications

· 11+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering and systems architecture.
· 4+ years of experience with Salesforce Cloud Technologies is a must.
· 4+ years of experience with Salesforce customer data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
· 4+ years of experience with Data Lake infrastructure, data warehousing and data analytics tools.
· 4+ years of experience in SQL optimization and performance tuning, plus development experience in programming languages like Python, PySpark and Scala.
· 2+ years of cloud data engineering experience in Azure; fluent with Azure cloud services. Azure certification is a plus.
· Experience with integration of multi-cloud services with on-premises technologies.
· Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
· Experience with data profiling and data quality tools like Apache Griffin, Deequ and Great Expectations.
· Experience building/operating highly available, distributed systems for data extraction, ingestion and processing of large data sets.
· Experience with at least one MPP database technology such as Redshift, Synapse or Snowflake.
· Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
· Experience with version control systems like GitHub, and with deployment and CI tools.
· Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools.
· Experience with statistical/ML techniques is a plus.
· Experience building solutions in the retail or supply chain space is a plus.
· Understanding of metadata management, data lineage and data glossaries is a plus.
· Working knowledge of agile development, including DevOps and DataOps concepts.
· Familiarity with business intelligence tools (such as Power BI).
· BA/BS in Computer Science, Math, Physics or other technical fields.

Skills, Abilities, Knowledge

· Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
· Proven track record of leading and mentoring data teams.
· Strong change manager; comfortable with change, especially that which arises through company growth.
· Ability to understand and translate business requirements into data and technical requirements.
· High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
· Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
· Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
· Foster a team culture of accountability, communication and self-management.
· Proactively drives impact and engagement while bringing others along.
· Consistently attain/exceed individual and team goals.
· Ability to lead others without direct authority in a matrixed environment.
· Salesforce Data Cloud accreditation; relevant Salesforce certifications and consulting experience are strongly recommended.
· Familiarity with data regulation.