8 - 13 years
12 - 22 Lacs
Hyderabad
Work from Office
About Company: Niha Technologies Inc. is a leading provider of IT solutions across various industries. Services include Enterprise Resource Planning, Business Process Management, Business Integration Services, Service-Oriented Architecture, Business Activity Monitoring, Enterprise Portals, and Cloud Solutions. We use customer-centric, value-driven project management and solution-delivery methodologies to deliver fixed-price, fixed-time solutions to customers in the Banking/Financial Services, Retail, Process Manufacturing, Oil and Gas, Utilities, and B2B Marketplaces industries. Niha Technologies Inc. achieves excellence through continuous learning, innovative solutions, and strategic partnerships with leading software companies (e.g., SAP, IBM, Microsoft, and Oracle) to provide the best value to our customers with a professional difference. We practice our own methodology, refined over several years of IT project implementation experience and built on Strategic Technical Application Techniques & Tactics, which serves as the backbone unifying our consulting offerings with consistent communication, template-based modular design, reusable object-based processes, and rapid design modelling.
Job Title: AWS AI/ML Data Engineer
Must have:
• 7+ years of experience in data engineering and development.
• 5+ years of experience with SQL/RDBMS.
• 5+ years of demonstrable experience with Python, PySpark, and analytics.
• At least 3 years of hands-on experience on the AWS cloud (services including Lambda, Glue, ECS, Step Functions, API Gateway, RDS, SQS, and DynamoDB).
• Experience with AI workloads, such as working with large language models (LLMs).
Nice to have:
• Pharma domain experience is a plus.
• Knowledge of Git and CI/CD and relevant tools (GitLab preferred).
• Knowledge of IaC (Terraform preferred).
• Knowledge of Snowflake and dbt is a plus.
• Knowledge of Java.
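The AWS services this posting pairs (Lambda, SQS) usually combine into event-driven ingestion: SQS triggers a Lambda that parses and normalizes each message. A minimal, hypothetical sketch of such a handler; the event shape follows the standard SQS trigger payload, while the lowercasing transform and return fields are purely illustrative:

```python
import json

def handler(event, context=None):
    """Hypothetical Lambda entry point for an SQS-triggered ingestion step.

    `event["Records"]` is the standard SQS trigger shape; each record
    carries a JSON document in its `body` field.
    """
    processed = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # Illustrative transform: normalize field names before loading downstream.
        processed.append({key.lower(): value for key, value in payload.items()})
    return {"count": len(processed), "rows": processed}
```

In a real deployment the return value would instead be written to a target (e.g. RDS or DynamoDB); returning the rows keeps the sketch self-contained.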
Posted 2 months ago
8 - 10 years
16 - 30 Lacs
Delhi NCR, Gurgaon, Noida
Hybrid
WHAT YOU'LL DO
• Design, document, and implement the data pipelines that feed data models for downstream consumption in Snowflake, using dbt and Airflow.
• Ensure correctness and completeness of the data transformed via engineering pipelines for end consumption in analytical dashboards.
• Actively monitor and triage technical challenges in critical situations that require immediate resolution.
• Evaluate viable technical solutions and share MVPs or PoCs in support of the research.
• Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
• Review work from other tech team members and provide feedback for growth.
• Implement data performance and data security policies that align with governance objectives and regulatory requirements.
• Effectively mentor and develop your team members.
YOU'RE GOOD AT
You have experience in data warehousing, data modeling, and building data engineering pipelines. You are well versed in data engineering methods, such as ETL and ELT techniques, through scripting and/or tooling. You are good at analyzing performance bottlenecks and providing enhancement recommendations; you have a passion for customer service and a desire to learn and grow as a professional and a technologist.
• Strong analytical skills for working with structured, semi-structured, and unstructured datasets.
• Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
• Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
• This role is heavily SQL-focused: an ideal candidate must have hands-on experience with SQL database design, plus Python.
• Demonstrably deep understanding of advanced SQL and analytical data warehouses (Snowflake preferred).
• Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub & Bitbucket).
• Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
• Familiar with Jira & Confluence.
• Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
• Desire to continually keep up with advancements in data engineering practices.
• Knowledge of AWS cloud and Python is a plus.
YOU BRING (EXPERIENCE & QUALIFICATIONS)
Essential Education
• Bachelor's degree or equivalent combination of education and experience.
• Bachelor's degree in information science, data management, computer science, or a related field preferred.
Essential Experience & Job Requirements
• 7+ years of IT experience with a major focus on data warehouse/database-related projects.
• Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
• Experience with other data platforms: Oracle, SQL Server, MDM, etc.
• Expertise in writing SQL and database objects (stored procedures, functions, and views); hands-on experience in ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies, e.g., dbt, APIs, Apache Airflow.
• Experience in data modeling and relational database design.
• Well versed in applying SCD, CDC, and DQ/DV frameworks.
• Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub & Bitbucket).
• Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
• Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
• Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (product owners, Scrum Masters, architects, and data SMEs).
• Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations.
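The SCD framework this posting emphasizes is, in the Type 2 variant, a close-and-insert rule: when a tracked attribute changes, the current dimension row is expired and a new current version is appended. A pure-Python sketch of that rule; all column names (`is_current`, `valid_to`, `value`) are illustrative rather than from any particular schema:

```python
def scd2_merge(current, incoming, key="id", tracked=("value",), asof="2024-01-01"):
    """Sketch of an SCD Type 2 merge over lists of dicts.

    `current` rows carry `is_current`/`valid_to` flags; `incoming` is the
    latest source snapshot keyed by `key`. Returns the merged dimension.
    """
    latest = {row[key]: row for row in incoming}
    merged, seen = [], set()
    for row in current:
        if row.get("is_current") and row[key] in latest:
            new = latest[row[key]]
            if any(row[col] != new[col] for col in tracked):
                # Attribute changed: expire the old version...
                merged.append({**row, "is_current": False, "valid_to": asof})
                # ...and open a new current version.
                merged.append({**new, "is_current": True, "valid_to": None})
            else:
                merged.append(row)  # unchanged, keep as-is
            seen.add(row[key])
        else:
            merged.append(row)  # historical rows pass through untouched
    for k, row in latest.items():
        if k not in seen:  # brand-new business key
            merged.append({**row, "is_current": True, "valid_to": None})
    return merged
```

In practice this logic usually lives in a dbt snapshot or a `MERGE` statement rather than application code; the sketch just makes the rule explicit.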
Posted 2 months ago
5 - 8 years
13 - 19 Lacs
Bengaluru, Kolkata, Mumbai (All Areas)
Work from Office
Dear Candidate, we have an urgent opening for a Snowflake developer in Mumbai/Bangalore/Kolkata/Gurgaon. Experience: 5 to 8 years. Notice period: immediate joiners only. Must have experience with Snowflake.
Posted 2 months ago
5 - 10 years
15 - 25 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
We are looking for an ETL Informatica BI Developer for a global leader providing assurance, consulting, strategy, transactions, and tax services. The firm's consistent delivery of high-quality services makes it a trusted partner for businesses around the world. The organization is employee friendly and believes in growing together.
Job Title - ETL Informatica BI Developer
Location - Mumbai, Bangalore, Pune, Hyderabad
Full Time
Experience - 5 to 12 Years
Salary - Competitive
Notice Period - Immediate to 15 Days
Hybrid
Skills:
Primary - SQL + Hadoop + Python + Informatica
Secondary - Unix + reporting tool (SAP BO/Power BI)
Good to have - Exposure to cloud technologies and Snowflake
If you are interested in exploring this opportunity, kindly share your resume at ekta@digitalxnode.com
Warm Regards
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Hyderabad
Remote
Key Responsibilities:
• Snowflake Architecture & Setup: Design and implement Snowflake environments, ensuring best practices in RBAC, network security policies, and external access integrations.
• Iceberg Catalog Implementation: Configure and manage Apache Iceberg catalogs within Snowflake and integrate with Azure ADLS Gen2 for external storage.
• External Storage & Access: Set up external tables, storage integrations, and access policies for ADLS Gen2, AWS S3, and GCS.
• Data Ingestion & Streaming: Implement Snowpipe, Dynamic Tables, and batch/streaming ETL pipelines for real-time and scheduled data processing.
• CI/CD & Automation: Develop CI/CD pipelines for Snowflake schema changes, security updates, and data workflows using Terraform, dbt, GitHub Actions, or Azure DevOps.
• Snowflake Notebooks & Snowpark: Use Snowflake Notebooks for analytics and data exploration, and develop Snowpark applications for machine learning and complex data transformations in Python, Java, or Scala.
• Security & Compliance: Implement RBAC, Okta SSO authentication, OAuth, network security policies, and governance frameworks for Snowflake environments.
• Notification & Monitoring Integration: Set up event-driven notifications and alerting using Azure Event Grid, SNS, or other cloud-native services.
• Performance & Cost Optimization: Continuously monitor query performance, warehouse utilization, and cost estimates, and apply optimizations to improve efficiency.
• Documentation & Best Practices: Define best practices for Snowflake architecture, automation, security, and performance tuning.
Required Skills & Experience:
• 7+ years of experience in data architecture and engineering, specializing in Snowflake.
• Expertise in SQL, Python, and Snowpark APIs.
• Hands-on experience with Iceberg catalogs, Snowflake Notebooks, and external storage (Azure ADLS Gen2, S3, GCS).
• Strong understanding of CI/CD for Snowflake, including automation with Terraform, dbt, and DevOps tools.
• Experience with Snowpipe, Dynamic Tables, and real-time/batch ingestion pipelines.
• Proven ability to analyze and optimize Snowflake performance, storage costs, and compute efficiency.
• Knowledge of Okta SSO, OAuth, federated authentication, and network security in Snowflake.
• Cloud experience in Azure, AWS, or GCP, including cloud networking and security configurations.
Additional Details: This is a contractual position for a duration of 6-12 months, and a completely remote opportunity.
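The external-storage and Snowpipe setup described here reduces to a small set of Snowflake DDL statements: a stage pointing at cloud storage, plus an auto-ingest pipe wrapping a `COPY INTO`. A sketch that templates those statements in Python; the integration, stage, table, and URL values are placeholders, not real resources:

```python
def snowpipe_ddl(stage="raw_stage", table="raw_events", pipe="raw_pipe",
                 integration="adls_int",
                 url="azure://myaccount.blob.core.windows.net/landing"):
    """Return the Snowflake DDL for an external stage plus an
    auto-ingest Snowpipe (all object names are placeholders)."""
    return [
        # External stage backed by a pre-created storage integration.
        f"CREATE STAGE IF NOT EXISTS {stage} URL = '{url}' "
        f"STORAGE_INTEGRATION = {integration} FILE_FORMAT = (TYPE = PARQUET);",
        # Auto-ingest pipe: cloud events (e.g. Event Grid) trigger the COPY.
        f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage};",
    ]
```

The statements would be executed through the Snowflake connector or checked into a Terraform/dbt-managed repo; templating them keeps schema changes reviewable in CI/CD, as the posting's automation bullet suggests.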
Posted 2 months ago
4 - 9 years
5 - 15 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
Job Title: Snowflake Developer with PL/SQL
Experience: 4+ Years
Location: Bangalore/Hyderabad/Pune
Job Description: We are looking for a highly skilled Snowflake developer with strong expertise in PL/SQL to join our team. In this role, you will design, implement, and maintain data pipelines and solutions on the Snowflake data platform. You will work closely with other teams to ensure seamless data integration, optimize query performance, and create efficient ETL processes using Snowflake and PL/SQL.
Requirements:
• Proven experience with the Snowflake data platform and SQL-based development.
• Strong expertise in PL/SQL, with experience writing complex stored procedures and functions.
• Hands-on experience in developing and optimizing ETL processes.
• Familiarity with data modeling, data warehousing, and data integration best practices.
• Knowledge of cloud data platforms, data migration, and data transformation techniques.
• Excellent problem-solving skills and the ability to work in a collaborative environment.
• Strong communication skills to work effectively with cross-functional teams.
Preferred Qualifications:
• Experience with Snowflake data sharing and Snowflake-specific features (e.g., Snowpipe, Streams, Tasks).
• Familiarity with cloud computing platforms like AWS, Azure, or Google Cloud is an added advantage.
Interested candidates can share your resume to saritha.yennapally@relanto.ai, or refer someone who is looking for a job change.
Regards, TA Team
Posted 2 months ago
5 - 10 years
13 - 23 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!!
This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
C.CTC, E.CTC, Notice Period, Current location, Are you serving notice period/immediate, Exp in Snowflake, Exp in Matillion.
Shift timings: 2:00 PM-11:00 PM (free cab facility for drop, plus food).
Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.
Interview Process: 2 rounds (virtual) + final round (F2F).
Please note: Work From Office only (no hybrid or Work From Home).
Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting
Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043
Company URL: www.suntechnologies.com
Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 2 months ago
6 - 9 years
8 - 11 Lacs
Hyderabad
Work from Office
Overview
As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build & operations, and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems.
Responsibilities
• Be a founding member of the data engineering team. Help attract talent by networking with your peers, representing PepsiCo HBS at conferences and other events, and discussing our values and best practices when interviewing candidates.
• Own data pipeline development end-to-end, spanning data modeling, testing, scalability, operability, and ongoing metrics.
• Ensure that we build high-quality software by reviewing peer code check-ins.
• Define best practices for product development, engineering, and coding as part of a world-class engineering team.
• Collaborate in architecture discussions and architectural decision making as part of continually improving and expanding these platforms.
• Lead feature development in collaboration with other engineers: validate requirements/stories, assess current system capabilities, and decompose feature requirements into engineering tasks.
• Focus on delivering high-quality data pipelines and tools through careful analysis of system capabilities and feature requests, peer reviews, test automation, and collaboration with other engineers.
• Develop software in short iterations to quickly add business value.
• Introduce new tools/practices to improve data and code quality; this includes researching/sourcing third-party tools and libraries, as well as developing tools in-house to improve workflow and quality for all data engineers.
• Support data pipelines developed by your team through good exception handling, monitoring, and, when needed, debugging production issues.
Qualifications
• 6-9 years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
• 4+ years of experience in SQL optimization and performance tuning.
• Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
• Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
• Experience with data profiling and data quality tools like Apache Griffin, Deequ, or Great Expectations.
• Current skills in the following technologies:
  - Python
  - Orchestration platforms: Airflow, Luigi, Databricks, or similar
  - Relational databases: Postgres, MySQL, or equivalents
  - MPP data systems: Snowflake, Redshift, Synapse, or similar
  - Cloud platforms: AWS, Azure, or similar
  - Version control (e.g., GitHub) and familiarity with deployment and CI/CD tools
• Fluent with Agile processes and tools such as Jira or Pivotal Tracker.
• Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes is a plus.
• Understanding of metadata management, data lineage, and data glossaries is a plus.
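Data-quality tools like the ones named above (Great Expectations, Deequ) are built around per-column expectations such as null-rate thresholds and uniqueness. A dependency-free sketch of two such checks; the function name and default threshold are illustrative, not any tool's real API:

```python
def profile_column(rows, column, max_null_rate=0.1, expect_unique=False):
    """Minimal data-quality check in the spirit of Great Expectations:
    flag a column whose null rate exceeds a threshold, or which is
    expected to be unique but contains duplicates."""
    values = [row.get(column) for row in rows]
    nulls = sum(v is None for v in values)
    null_rate = nulls / len(values) if values else 0.0
    result = {
        "column": column,
        "null_rate": null_rate,
        "null_ok": null_rate <= max_null_rate,
    }
    if expect_unique:
        non_null = [v for v in values if v is not None]
        result["unique_ok"] = len(non_null) == len(set(non_null))
    return result
```

In a pipeline, a failed check would typically fail the run or quarantine the batch; the real tools add suites, persistence, and reporting on top of exactly this kind of predicate.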
Posted 2 months ago
10 - 16 years
30 - 35 Lacs
Gurgaon
Hybrid
We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms like Salesforce, Informatica, dbt, IICS, and Snowflake.
Key Responsibilities:
• Gather, document, and analyze business requirements for data application and integration projects.
• Work closely with business stakeholders to translate business needs into technical solutions.
• Design and oversee API integrations to ensure seamless data flow across platforms.
• Collaborate with cross-functional teams including developers, data engineers, and architects.
• Define and maintain data integration strategies, ensuring high availability and security.
• Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
• Develop use cases, process flows, and documentation to support business and technical teams.
• Ensure compliance with data governance and security best practices.
• Act as a liaison between business teams and technical teams, providing insights and recommendations.
Key Skills & Requirements:
• Strong expertise in business analysis methodologies and data-driven decision-making.
• Hands-on experience with API integration and data application management.
• Proficiency in Salesforce, Informatica, dbt, IICS, and Snowflake.
• Strong analytical and problem-solving skills.
• Ability to work in an Agile environment and collaborate with multi-functional teams.
• Excellent communication and stakeholder management skills.
Posted 2 months ago
5 - 10 years
5 - 15 Lacs
Bengaluru
Work from Office
Senior Data Engineer Job Description
Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems.
• Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance.
• Work in tandem with our engineering team to identify and implement the most optimal solutions.
• Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design.
• Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures.
• Able to manage deliverables in fast-paced environments.
Areas of Expertise
• At least 5+ years of experience designing and developing data solutions in an enterprise environment.
• At least 2+ years of experience on the Snowflake platform.
• Strong hands-on SQL and Python development.
• Experience designing and developing data warehouses in Snowflake.
• A minimum of three years of experience developing production-ready data ingestion and processing pipelines using Spark and Scala.
• Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic.
• Good understanding of metadata and data lineage.
• Hands-on knowledge of SQL analytical functions.
• Strong knowledge of and hands-on experience in shell scripting and JavaScript.
• Able to demonstrate experience with software engineering practices including CI/CD, automated testing, and performance engineering.
• Good understanding of and exposure to Git, Confluence, and Jira.
• Good problem-solving and troubleshooting skills.
• Team player with a collaborative approach and excellent communication skills.
Posted 2 months ago
5 - 10 years
30 - 45 Lacs
Bengaluru
Hybrid
Data Engineer
Key Responsibilities:
• Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Snowflake.
• Work closely with the fund teams to understand data structures and business requirements, ensuring data accuracy and quality.
• Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems.
• Optimize Snowflake database performance by designing scalable and cost-effective solutions.
• Design the Snowflake data model to effectively handle business needs.
• Work closely with the AI engineer and build data pipelines where necessary to support AI/ML projects.
Skills Required:
• 5+ years of experience in IT working on data projects, with 3+ years of experience with Snowflake.
• Proficiency in the Snowflake Data Cloud, including schema design, data partitioning, and query optimization.
• Strong SQL and Python skills; hands-on experience with Python libraries such as PySpark, pandas, and Beautiful Soup.
• Experience with ETL/ELT tools like Fivetran, Apache Spark, and dbt.
• Experience with RESTful APIs.
• Familiarity with workload automation and job scheduling tools such as Control-M or Apache Airflow.
• Familiarity with data governance frameworks.
• Familiarity with the Azure cloud.
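Migration pipelines like the ones this role describes are usually incremental: each run tracks a high-watermark (commonly a timestamp column) so only rows changed since the last run are moved into Snowflake. A minimal pure-Python sketch of that pattern; the field name `updated_at` and the integer timestamps are invented for illustration:

```python
def incremental_batch(source_rows, watermark, ts_field="updated_at"):
    """Select rows newer than the last persisted watermark and return
    the new watermark to store for the next run (a common ELT pattern)."""
    fresh = [row for row in source_rows if row[ts_field] > watermark]
    # If nothing is new, the watermark stays where it was.
    new_watermark = max((row[ts_field] for row in fresh), default=watermark)
    return fresh, new_watermark
```

Orchestrators such as Airflow or Control-M would schedule this per table, persisting the watermark in a metadata store between runs.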
Posted 2 months ago
10 - 13 years
25 - 37 Lacs
Hyderabad
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
• Total experience of 10+ years.
• Experience in data engineering and database management.
• Expert knowledge of PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP).
• Experience with the Snowflake data warehouse and strong SQL programming skills.
• Deep understanding of stored procedures, performance optimization, and handling large-scale data.
• Knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
• Strong understanding of index design and performance-tuning techniques.
• Familiarity with SQL security techniques, including data encryption, Transparent Data Encryption (TDE), signed stored procedures, and user permission assignments.
• Competence in data preparation and ETL tools to build and maintain data pipelines and flows.
• Experience in data integration, mapping various source platforms into Entity Relationship Models (ERMs).
• Exposure to source control systems like Git and Azure DevOps.
• Expertise in Python and machine learning (ML) model development.
• Experience with automated testing and test-coverage tools.
• Hands-on experience with CI/CD automation tools.
• Programming experience in Golang.
• Understanding of Agile methodologies (Scrum, Kanban).
• Ability to collaborate with stakeholders across Executive, Product, Data, and Design teams.
RESPONSIBILITIES:
• Design and maintain an optimal data pipeline architecture.
• Assemble large, complex data sets to meet functional and non-functional business requirements.
• Develop pipelines for data extraction, transformation, and loading (ETL) using SQL and cloud database technologies.
• Prepare and optimize ML models to improve business insights.
• Support stakeholders by resolving data-related technical issues and enhancing data infrastructure.
• Ensure data security across multiple data centers and regions, maintaining compliance with national and international data laws.
• Collaborate with data and analytics teams to enhance data systems functionality.
• Conduct exploratory data analysis to support database and dashboard development.
Posted 2 months ago
8 - 12 years
20 - 25 Lacs
Pune
Hybrid
So, what's the role all about?
The R&D Business Operations team is responsible for collecting, analyzing, and managing R&D data, change requests, and various cross-unit projects and initiatives. In this role you will be responsible for dashboards used by the R&D group. You will create new data sources and reports connecting to a wide range of tools used in R&D, while also being part of the ongoing management of KPI measurement. We are looking for a self-starter with a bias for action, who is able to move new initiatives forward while collaborating with all types of stakeholders.
How will you make an impact?
• Design, model, and implement dashboards in Power BI.
• Provide analytical and functional support.
• Work effectively with a variety of stakeholders to ensure successful execution, including R&D teams, product managers, IT, business operations, etc.
• Write and maintain technical documentation.
• Review and validate data.
• End-to-end accountability for your domain, including troubleshooting the reporting environment and training end users on new reports and dashboards.
• Continually find opportunities for improvements and enhancements in existing implementations.
Have you got what it takes?
• Degree, preferably in Industrial Engineering/Information Systems/Computer Science.
• At least 8 years of overall experience.
• Experience with Power BI - must.
• Experience with SQL - must.
• Experience with data modeling - an advantage.
• Experience with a cloud data warehouse (Snowflake) - an advantage.
• Experience with R&D tools - an advantage.
• Excellent problem-solving, organizational, and analytical skills.
• Exceptional communication skills.
• A team player with a "can-do" attitude.
• Fluent English.
What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!
Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 6544
Reporting into: Director of R&D Operations
Role Type: Individual Contributor
Posted 2 months ago
6 - 11 years
25 - 35 Lacs
Pune, Delhi NCR, India
Hybrid
Exp - 6 to 10 Yrs
Role - Data Modeller
Position - Permanent
Job Locations - DELHI/NCR (Remote), PUNE (Remote), CHENNAI (Hybrid), HYDERABAD (Hybrid), BANGALORE (Hybrid)
Experience & Skills:
• 6+ years of experience with strong data modeling and data warehousing skills.
• Able to suggest modeling approaches for a given problem.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Real-time experience working with OLAP & OLTP database models (dimensional models).
• Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
• An eye for analyzing data, and comfortable following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
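The star-schema modeling this role centers on amounts to splitting denormalized source rows into a deduplicated dimension table (with surrogate keys) and a fact table that references it. A toy Python sketch of that split; the column names (`product`, `region`, `amount`) are invented for illustration:

```python
def split_star(rows, dim_cols, measure_cols):
    """Split denormalized rows into one dimension table (deduplicated,
    with surrogate keys) and a fact table referencing it by key."""
    dim, dim_keys, facts = [], {}, []
    for row in rows:
        attrs = tuple(row[c] for c in dim_cols)
        if attrs not in dim_keys:
            dim_keys[attrs] = len(dim) + 1  # assign a surrogate key
            dim.append({"sk": dim_keys[attrs], **dict(zip(dim_cols, attrs))})
        # Fact rows keep only measures plus the dimension's surrogate key.
        facts.append({"dim_sk": dim_keys[attrs],
                      **{m: row[m] for m in measure_cols}})
    return dim, facts
```

A snowflake schema would further normalize the dimension (e.g., splitting region into its own table); the star form keeps it flat for simpler joins, which is the usual trade-off between the two.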
Posted 2 months ago
10 - 18 years
30 - 45 Lacs
Pune, Delhi NCR, India
Hybrid
Exp - 10 to 18 Years
Role - Data Modeling Architect / Senior Architect / Principal Architect
Position - Permanent
Locations - Hyderabad (Hybrid), Bangalore (Hybrid), Chennai (Hybrid), Pune (Remote till office opens), Delhi NCR (Remote till office opens)
JD:
• 10+ years of experience in data warehousing & data modeling: dimensional/relational/physical/logical.
• Decent SQL knowledge.
• Able to suggest modeling approaches and solutions for a given problem.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Real-time experience working with OLAP & OLTP database models (dimensional models).
• Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
• An eye for analyzing data, and comfortable following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
• Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
• Good experience in stakeholder management.
• Decent communication skills and experience leading a team.
Posted 2 months ago
5 - 10 years
5 - 15 Lacs
Bengaluru
Hybrid
5 years of relevant experience with Airflow, Snowflake, advanced dbt concepts, AWS, and advanced SQL.
Posted 2 months ago
10 - 15 years
35 - 40 Lacs
Mumbai, Bengaluru, Gurgaon
Work from Office
Data Strategy & Data Governance Manager
Join our team in Technology Strategy for an exciting career opportunity to enable our most strategic clients to realize exceptional business value from technology.
Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data Strategy | Level: Manager | Location: Bangalore/Gurgaon/Mumbai/Pune/Chennai/Hyderabad/Kolkata | Years of Exp: 10 to 15 years
Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.
The Practice - A Brief Sketch:
The Technology Strategy & Advisory Practice focuses on the clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster.
• Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform business.
• Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
• Formulation of Guiding Principles and Components: Assessing impact to the client's technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components.
• Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions.
Bring your best skills forward to excel in the role:
• Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
• Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
• Design and guide development of enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata, and master data strategy.
• Establish a framework for effective data governance across multispeed implementations; define data ownership, standards, policies, and associated processes.
• Define a Data & Analytics operating model to manage data across the organization; establish processes for effective data management, ensuring data quality & governance standards as well as roles for data stewards.
• Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions.
• Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas.
• Develop and drive Data Capability Maturity Assessment, Data & Analytics operating model, and data governance exercises for clients.
• A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, large-scale data lakes, and DW-on-cloud solutions.
• Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure, or AWS.
• Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions. Demonstrate strong understanding of a specific industry, client or technology, and function as an expert advising senior leadership. Manage budgeting and forecasting activities and build financial proposals.

Qualifications - Your experience counts!
MBA from a tier-1 institute.
5-7 years of strategy consulting experience at a consulting firm.
3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy.
At least 2 years of experience architecting or designing solutions for any two of these domains: Data Quality, Master Data (MDM), Metadata, data lineage, data catalog.
Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms like AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, Palantir.
Deep understanding of the data supply chain and building value realization frameworks for data transformations.
3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as POCs.
Foundational understanding of data privacy is desired.
Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends (e.g. Mobility, Cloud, Digital, Collaboration).
A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources or equivalent domains.
CDMP Certification from DAMA preferred.
Cloud Data & AI Practitioner certifications (Azure, AWS, Google) desirable but not essential.
Posted 2 months ago
5 - 10 years
50 - 55 Lacs
Bengaluru
Hybrid
WHAT YOU'LL DO: As a member of the team, you will be responsible for developing, testing, and deploying data-driven software products in AWS Cloud. You will work with a team consisting of a data scientist, an enterprise architect, data engineers and business users to enhance product feature sets.

Qualification & Skills:
Mandatory:
Knowledge and experience in audit, expense and firm-level data elements.
Knowledge and experience in audit and compliance products/processes.
End-to-end product development lifecycle knowledge/exposure.
Strong in AWS Cloud and associated services like Elastic Beanstalk, SageMaker, EFS, S3, IAM, Glue, Lambda, SQS, SNS, KMS, Encryption, Secrets Manager.
Strong experience in Snowflake database operations.
Strong in SQL and the Python programming language.
Strong experience in a web development framework (Django).
Strong experience in React and associated frameworks (Next.js, Tailwind, etc.).
Experience in CI/CD pipelines and DevOps methodology.
Experience in SonarQube integration and best practices.
Implementation of security best practices for web applications and cloud infrastructure.
Knowledge of Wiz.io for security protocols related to the AWS Cloud platform.
Nice to Have:
Knowledge of data architecture, data modeling, best practices and security policies in the data management space.
Basic data science knowledge preferred.
Experience in KNIME/Tableau/Power BI.
Experience & Education: Between 5 to 15 years of IT experience. Bachelor's/Master's degree from an accredited college/university in a business-related or technology-related field.
Posted 2 months ago
10 - 18 years
30 - 35 Lacs
Chennai
Work from Office
We are looking for people who have experience in digital implementations on cloud platforms, leading architecture design and discussions. ETL SME, SQL, Snowflake, and Data Engineering skills. Alert monitoring, scheduling, and auditing knowledge. Nice to have: experience with agile, working in compliance-regulated environments, exposure to manufacturing IIoT data. 8-10 years of relevant experience.
Posted 2 months ago
8 - 13 years
20 - 35 Lacs
Bengaluru
Hybrid
Job description
Job Title: Azure Data Engineer
Location: Bangalore (Hybrid)
Exp: 8+ Years
Minimum Qualifications:
Degree in Analytics, Business Administration, Data Science or equivalent experience.
Creative problem-solving skills.
Exemplary attention to detail; critical that you understand the data before finalizing a query.
Strong written & verbal communication skills.
Ensures queries are accurate (written correctly & provide the intended outcome).
Skilled at creating documentation to run and troubleshoot the tools/reports you create.
Collaboration and networking ability.
8+ years of experience using SQL & creating reports in Power BI.
Tools: SQL, Snowflake, Power BI, Microsoft Excel, Azure Databricks, ADL, ADF, DAX
Posted 2 months ago
6 - 10 years
8 - 12 Lacs
Bengaluru
Work from Office
About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 492,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

About Global Network: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, every business is a digital business. Digital is changing the way organizations engage with their employees, business partners, customers, and communities; how they manufacture and deliver products and services; and how they run their organizations. This is our unique differentiator. We seek people who recognize and understand the impact that digital and technology have on every industry and every sector, and share our passion to shape unique strategies that allow our clients to succeed in this environment. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Global Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle.
Approximately 10,000 consultants are part of this rapidly expanding network, providing specialized and strategic industry and functional consulting expertise from key locations around the world. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Practice Overview:
Skill/Operating Group: Technology Consulting
Level: Consultant
Location: Gurgaon/Mumbai/Bangalore/Kolkata/Pune
Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities.
Architect large-scale data lake, DW, and Delta Lake on cloud solutions using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera.
Design Data Mesh strategy and architecture.
Build strategy and roadmap for data migration to cloud.
Establish Data Governance strategy & operating model.
Implement programs/interventions that prepare the organization for implementation of new business processes.
Deep understanding of Data and Analytics platforms and data integration with cloud.
Provide thought leadership to the downstream teams for developing offerings and assets.
Identify, assess, and solve complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
Oversee the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models.
Apply your solid understanding of data, data on cloud and disruptive technologies.
Drive enterprise business, application, and integration architecture.
Help solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a "to-be" enterprise environment.
Assist our clients to build the required capabilities for growth and innovation to sustain high performance.
Manage multi-disciplinary teams to shape, sell, communicate, and implement programs.
Experience in participating in client presentations & orals for proposal defense, etc.
Experience in effectively communicating the target state, architecture & topology on cloud to clients.

Qualifications:
Bachelor's degree; MBA degree from a Tier-1 college (preferable).
6-10 years of large-scale consulting experience and/or working with high-tech companies in data architecture, data governance, data mesh, data security and management.
Certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics.

Experience: We are looking for experienced professionals with data strategy, data architecture, data on cloud, data modernization, data operating model and data security experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
Data SME - experience in deal shaping & strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data on cloud strategy, data strategy, data operating model, change management of data transformation programs, data modeling skills.
Data on Cloud Architect - technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience in architecting large-scale data lake and DW on cloud solutions.
Experience in one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
Data Strategy - Data Capability Maturity Assessment, Data & Analytics / AI Strategy, Data Operating Model & Governance, Data Hub Enablement, Data on Cloud Strategy, Data Architecture Strategy.
Data Transformation Lead - understanding of the data supply chain and data platforms on cloud; experience in conducting alignment workshops and building value realization frameworks for data transformations; program management experience.
Exceptional interpersonal and presentation skills - ability to convey technology and business value propositions to senior stakeholders.
Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.
Other desired skills:
Strong desire to work in technology-driven business transformation.
Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities.
Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences.
Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains.
Flexibility to accommodate client travel requirements.
Published thought leadership: whitepapers, POVs.
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Chennai
Work from Office
The impact you will have in this role: Being a member of the Automation Enablement team means being part of a technology team with a rich, diverse skill set and a great, enthusiastic, committed group of colleagues. Whether it's Terraform, Python, shell scripting, or Ansible development, such as enhancements and customizations to Jenkins pipelines using Groovy/Python and similar CI/CD tooling, we are there for each other, collaborating and helping each other achieve the common goal. We are embarking on an incredible multi-year transformation journey and we are looking for best-of-breed software engineers to join us on this journey. In this role, you would be responsible for contributing to new project implementations related to building a new automation enablement development platform that supports the IaC Mandate program and the Jenkins platform that is the backbone for all systems and applications at DTCC. There is also significant opportunity for advancement and growth in this role based on your performance and contribution towards the organization's goals.

Your Primary Responsibilities: The Automation Enablement team focuses on developing the people, processes and tools that support build and delivery automation throughout the enterprise. This role will involve development, maintenance and support of our IaC development pipelines, which support our Continuous Integration, Automated Provisioning and Continuous Delivery capabilities across both our cloud and on-prem platforms. Work with stakeholders to define clear project objectives and requirements, agree on priorities, communicate progress updates and demonstrations, resolve issues and conflicts, and provide responsive change management. Develop tools and processes to reduce deployment obstacles, improve developer productivity, reduce risk and lead to faster time to market.
Develop subject matter expertise in one or more programming languages, vendor products, DTCC applications, data structures, and business lines.
Frameworks: is aware of frameworks and leverages them under guidance of more senior technical staff.
Security: implements solutions and executes test plans, working with more senior technical staff to validate security requirements.
Standards: is aware of technology standards and understands that technical solutions need to be consistent with them. Be able to independently follow design and code standards, contributing to continuous improvement discussions.
Documentation: develops and maintains system documentation.
Write clean, self-documenting code following standard methodologies, incorporating planning, design, coding, deployment and testing.
Collaborate with the Senior Automation Engineers and other technical contributors at all levels during application development.
Participate in code reviews, sprint meetings and retrospectives.
Provide ongoing maintenance support for the applications during the post-deployment support phase.
Foster a risk management culture through implementation and demonstration of processes and procedures which identify and mitigate risk.

Qualifications: Expertise in working in large, collaborative teams to achieve organizational goals. Strong communication & collaboration skills. Be a self-starter, providing creative and innovative ideas or solutions to continuously improve the technological offering.

**NOTE: Responsibilities of this role are not limited to the details above.**

Qualifications: Minimum of 3 years of related experience. Bachelor's degree preferred or equivalent experience.
Talents Needed for Success:
Hands-on experience with **Ansible** for configuration management and automation.
Strong knowledge of CI/CD development, such as enhancements and customizations to Jenkins pipelines using Python.
Extensive experience with Python and Java.
Hands-on experience with scripting languages like JavaScript, Shell Script, Bash.
Experience in product lifecycle management & agile-based delivery.
Familiarity with CI/CD tools & processes (Git/Bitbucket, Maven, Ant, Jenkins, Sonar).
Familiarity with Agile methodology and tools (JIRA).
Proven understanding of cloud deployments (Private Cloud / AWS / Azure).
Familiarity with database skill sets (PostgreSQL, Snowflake).
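The configuration-management work this listing describes centres on idempotency: applying the same change twice must leave the system unchanged the second time. A minimal pure-Python sketch of that idea, loosely mirroring the changed/ok reporting of Ansible's lineinfile module (the function name and file contents here are illustrative, not part of DTCC's stack):

```python
from pathlib import Path

def ensure_line(path: str, line: str) -> bool:
    """Idempotently ensure that `line` appears in the file at `path`.

    Returns True if the file changed, False if it was already compliant,
    similar to the changed/ok result a configuration-management tool reports.
    """
    p = Path(path)
    existing = p.read_text().splitlines() if p.exists() else []
    if line in existing:
        return False  # already present: change nothing (idempotent)
    existing.append(line)
    p.write_text("\n".join(existing) + "\n")
    return True
```

Running it twice with the same arguments reports a change only on the first run; that convergence property is what makes automation like this safe to re-run in a CI/CD pipeline.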
Posted 2 months ago
6 - 10 years
15 - 30 Lacs
Bengaluru
Work from Office
WHAT YOU WILL EXPERIENCE IN THIS POSITION: As an nVent Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and platforms that enable data-driven decision making and insights across the organization. You will work with a variety of data sources, technologies, emerging tools, and frameworks. You will collaborate with Business Stakeholders, Analytics Engineers, Data Engineers, and Data Architects to understand data requirements and deliver solutions.

Key Responsibilities:
Develop and maintain scalable, reliable, and secure data pipelines and platforms using cloud-based technologies and best practices (AWS, Snowflake, dbt, HVR, Matillion).
Integrate and transform data from various sources, such as SAP, SQL Server, S3, Infor M3, JDE, Salesforce, and APIs.
Optimize and monitor the performance, quality, and availability of data pipelines and platforms, ensuring data integrity and consistency.
Support data analysis and reporting by creating and maintaining data models, schemas, views, and dashboards using tools like Power BI, Tableau, or SSRS.
Collaborate with data engineers, data analysts, data scientists, and business stakeholders to understand and translate business requirements into technical specifications and solutions.
Document and communicate data engineering processes, standards, and best practices to ensure alignment and knowledge sharing across the team and the organization.
Research and evaluate new data technologies and trends, and provide recommendations for continuous improvement and innovation.
Collaborate with stakeholders across the business to understand their needs and challenges, translating them into clear, concise, and technically feasible data analysis requests.

You have: A Bachelor's or Master's degree in Computer Science, Engineering or a related field.
8+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
Experience developing end-to-end technical solutions and sustaining solutions in production, ensuring performance, security, scalability, and robust data integration.
Experience in cloud data platforms like Snowflake, AWS, Azure, Databricks.
Data integration and transformation using tools such as dbt, Matillion, HVR, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies.
Cloud and data storage: S3, ADLS, Elasticsearch, Cassandra or other NoSQL storage systems.
Experience with agile project methodologies and a collaborative work style.
Experience in SAP BW & SAP HANA is a plus.
Reporting and dashboarding tools like SSRS, BEx, Power BI, Tableau are a plus.
Microsoft's SQL Server data platform technology (SSIS, SSAS) is a plus.

Skills:
Programming expertise in SQL, Python, PySpark or Scala.
Ability to write, debug, and optimize SQL queries.
Strong analytical skills and the ability to combine data from different sources.
Excellent communication and presentation skills, with the ability to explain complex data to non-technical audiences.
User-facing written and verbal communication skills and experience.
Posted 2 months ago
4 - 8 years
5 - 12 Lacs
Mumbai Suburbs
Work from Office
4+ years of experience working with Snowflake and cloud data platforms. Strong expertise in SQL. Experience with ETL/ELT tools. Familiarity with cloud platforms (AWS, Azure, or GCP). Immediate joiner (able to start within 2 weeks).
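Snowflake/SQL roles like this one lean heavily on window functions for ELT work such as deduplicating reloaded staging data. A minimal sketch of the pattern: in Snowflake this is usually written with `QUALIFY ROW_NUMBER() ... = 1`, while here the stdlib sqlite3 engine stands in, so the window function sits in a subquery; the table and column names are invented for illustration.

```python
import sqlite3

# In-memory stand-in for a staging table that has received duplicate loads.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INT, loaded_at TEXT, amount REAL);
    INSERT INTO stg_orders VALUES
        (1, '2024-01-01', 10.0),
        (1, '2024-01-02', 12.5),   -- later reload of the same order
        (2, '2024-01-01', 99.0);
""")

# Keep only the most recently loaded row per order_id.
rows = conn.execute("""
    SELECT order_id, amount FROM (
        SELECT order_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY loaded_at DESC
               ) AS rn
        FROM stg_orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()
```

The result keeps one row per `order_id`, the one with the latest `loaded_at`, which is the standard "latest record wins" dedup step in an ELT pipeline.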
Posted 2 months ago
2 - 7 years
6 - 16 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Responsibilities: A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake
Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/nonfunctional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate
Educational Requirements: Master of Technology, MCA, MSc, Bachelor of Engineering, Bachelor of Technology, BCA, BSc
Posted 2 months ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum

A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
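For candidates brushing up on the ETL side of that skill list, the core extract-transform-load loop can be sketched in a few lines of plain Python. This is a toy illustration only; the records, field names, and in-memory "warehouse" are invented for the example, not drawn from any listing above.

```python
# Extract: raw records as they might arrive from a source system,
# with inconsistent types and whitespace.
raw = [
    {"id": "1", "city": " bengaluru ", "salary_lakhs": "12"},
    {"id": "2", "city": "Hyderabad",   "salary_lakhs": "18"},
]

# Transform: normalise types and clean up string fields.
def transform(rec):
    return {
        "id": int(rec["id"]),
        "city": rec["city"].strip().title(),
        "salary_lakhs": int(rec["salary_lakhs"]),
    }

# Load: key the cleaned records by id in an in-memory "warehouse" dict.
warehouse = {r["id"]: r for r in map(transform, raw)}
```

Real pipelines swap the dict for a warehouse table and the transform for SQL or a tool like dbt, but the extract-clean-load shape is the same.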
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!