
244 Data Transformation Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the source job portal.

10 - 15 years

30 - 45 Lacs

Bengaluru

Work from Office

Source: Naukri

Join our team in Technology Strategy for an exciting career opportunity to enable our most strategic clients to realize exceptional business value from technology.

Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data & AI Strategy | Level: Manager | Location: Bangalore/Gurgaon/Mumbai/Pune/Chennai/Hyderabad/Kolkata | Years of Exp: 10 to 15 years

Explore an Exciting Career at Accenture. Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch: The Technology Strategy & Advisory Practice is a part of Accenture Strategy and focuses on clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling AI, and the data that fuels it all, to power every person and every process. You will be part of our global team of experts who work on scalable solutions and services that help clients achieve their business objectives faster.

Key areas of work:
• Business Transformation: Assess AI potential and develop use cases that can transform the business.
• Proof of Concepts: Help design POCs and high-level solutions using AI or Gen AI analytics solutions to derive insights from data.
• Formulation of Guiding Principles and Components: Assess the impact on the client's technology landscape/architecture and ensure formulation of relevant guiding principles and platform components.
• Products and Frameworks: Evaluate existing AI and Gen AI products and frameworks and develop options for proposed solutions.

Bring your best skills forward to excel in the role:
• Leverage your knowledge of technology trends across Data & AI and how they can be applied to address real-world problems and opportunities.
• Interact with client stakeholders to understand their AI problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
• Design and guide development of enterprise-wide AI and Gen AI strategy for our clients.
• Through your expertise and experience, guide your team to suggest the right solutions for clients' needs and help draw up practical implementation roadmaps that position them for long-term success.
• Benchmark against global research and leading industry peers to understand the current state and recommend AI and Gen AI solutions.
• Conduct discovery workshops and design sessions to elicit AI and Gen AI opportunities and client pain areas.
• Utilize strong expertise and certification in any of the AI and Gen AI cloud platforms (Google, Azure or AWS) in areas such as Machine Learning, NLP and Generative AI.
• Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
• Have a deep understanding of the Responsible AI and Gen AI framework as well as tools to engage the client in meaningful discussions and steer towards the right recommendation.
• Define an AI use-case-driven value realization framework and define business cases relevant to the client's industry.
• Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions.
• Demonstrate strong understanding of a specific industry, client or technology and function as an expert advising senior leadership.
• Manage budgeting and forecasting activities and build financial proposals.

Your experience counts!
• MBA from a tier 1 institute.
• 5-7 years of strategy consulting experience at a consulting firm.
• 3+ years of experience writing business cases (quantitative and qualitative) to support strategic business initiatives or Data & AI transformation.
• 3+ years of experience designing and building end-to-end enterprise AI and Gen AI strategic solutions using cloud and non-cloud platforms such as Google Vertex and Gemini, Azure AI, OpenAI, Copilot, Amazon Bedrock, IBM Watson and DataRobot.
• Excellent understanding of traditional AI, Gen AI, agentic AI and advanced cognitive methods to derive insights and actions.
• Good working knowledge of formulating relevant guiding principles and platform components; expertise in Responsible AI frameworks is highly preferred.
• 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & AI strategy recommendations as well as POCs.
• Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends (e.g. mobility, cloud, digital, collaboration).
• A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources or equivalent domains.
• Cloud AI practitioner certifications (Azure, AWS, Google) desirable but not essential.

What's in it for you?
• An opportunity to work with key G2000 clients.
• Potential to work with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
• Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
• Personalized training modules to grow your skills, industry knowledge and capabilities.
• Opportunity to thrive in a culture that is committed to accelerating equality for all.
• Engage in boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

At the heart of every great change is a great human. If you have ideas, ingenuity and a passion for making a difference, come join us.

Qualifications: MBA from a tier 1 institute and the experience requirements listed under "Your experience counts!" above.

Posted 1 month ago

Apply

9 - 11 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Data Modeling Techniques and Methodologies. Good-to-have skills: Data Engineering, Cloud Data Migration. Minimum 9 years of experience is required. Educational Qualification: BE or BTech (must).

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Key Responsibilities:
1. Drive discussions with the client's deal teams to understand business requirements and how the Industry Data Model fits into implementation and solutioning.
2. Develop the solution blueprint and handle scoping, estimation and staffing for the delivery project.
3. Drive discovery activities and design workshops with the client, and lead strategic road-mapping and operating model design discussions.
4. Good to have: Data Vault, cloud DB design, graph data modeling, ontology, data engineering, data lake design.

Technical Experience:
1. 9+ years overall experience, with 4+ years in data modeling (cloud DB models, 3NF, dimensional), including conversion of RDBMS data models to graph data models; instrumental in DB design through all stages of the data model.
2. Experience on at least one cloud DB design engagement; must be familiar with data architecture principles.

Professional Attributes:
1. Strong requirement analysis and technical solutioning skills in Data and Analytics.
2. Excellent writing, communication and presentation skills.
3. Eagerness to learn and develop oneself on an ongoing basis.
4. Excellent client-facing and interpersonal skills.

Educational Qualification: BE or BTech (must). Additional Info: Experience in estimation, PoVs and solution approach creation; experience in data transformation, analytics projects and DWH.
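As a rough illustration of the RDBMS-to-graph conversion this role mentions, the sketch below (with hypothetical table and column names, not tied to any client model) turns relational customer/order rows into nodes and makes the foreign-key relationship an explicit edge:

```python
# Minimal relational-to-graph conversion sketch; names are hypothetical.
customers = [{"customer_id": 1, "name": "Asha"}, {"customer_id": 2, "name": "Ravi"}]
orders = [{"order_id": 10, "customer_id": 1, "amount": 2500.0}]

nodes, edges = [], []
for c in customers:
    # Each relational row becomes a labeled node keyed by its primary key.
    nodes.append({"label": "Customer", "key": c["customer_id"], "props": c})
for o in orders:
    nodes.append({"label": "Order", "key": o["order_id"], "props": o})
    # The customer_id foreign key becomes an explicit PLACED relationship.
    edges.append({"type": "PLACED",
                  "from": ("Customer", o["customer_id"]),
                  "to": ("Order", o["order_id"])})

print(len(nodes), "nodes,", len(edges), "edges")
```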

Posted 1 month ago

Apply

1 - 2 years

3 - 4 Lacs

Bengaluru

Work from Office

Source: Naukri

Dear Candidate, we are hiring a Data Analyst for an MNC in Bangalore. Role: Data Analyst. Contract: 6 months (conversion based on performance). Experience: 6 months to 2 years. Work Mode: Hybrid. Notice Period: Immediate to 30 days. Location: Bangalore-based candidates only. Shift: General. Skills Required: Experience in the finance domain (preferred). Technical Skills: Power BI, SQL, Python, Tableau. Core Skills: Data Management, Data Transformation, Data Reporting, Data Visualization, Dashboarding. If interested, please share your updated CV with arthie.m@orcapod.work.
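A minimal sketch of the core skills this posting lists (data transformation plus a report shape a Power BI or Tableau dashboard could consume), using a made-up finance-style dataset:

```python
import pandas as pd

df = pd.DataFrame({
    "region": ["South", "South", "North"],
    "month": ["2024-01", "2024-02", "2024-01"],
    "revenue": ["1,200", "1,450", "980"],   # raw text values, as often exported
})

# Transformation: clean the text column into a numeric one.
df["revenue"] = df["revenue"].str.replace(",", "").astype(float)

# Reporting: revenue by region per month, ready for a dashboard.
report = df.pivot_table(index="region", columns="month",
                        values="revenue", aggfunc="sum", fill_value=0)
print(report)
```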

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Workday Prism Analytics. Good-to-have skills: NA. Minimum 5 years of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Lead the design and development of applications. Implement best practices for application development. Conduct code reviews and ensure code quality. Stay updated on industry trends and technologies.

Professional & Technical Skills: Proficiency in Workday Prism Analytics (must-have). Strong understanding of data analytics and visualization. Experience with data modeling and data transformation. Hands-on experience in building and configuring applications. Knowledge of cloud-based application development.

Additional Information: The candidate should have a minimum of 5 years of experience in Workday Prism Analytics. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 9 years

13 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Position Summary: Looking for a Salesforce Data Cloud Engineer to design, implement, and manage data integrations and solutions using Salesforce Data Cloud (formerly Salesforce CDP). This role is essential for building a unified, 360-degree view of the customer by integrating and harmonizing data across platforms.

Job Responsibilities: Consolidate customer data to create a unified customer profile. Design and implement data ingestion pipelines into Salesforce Data Cloud from internal and third-party systems. Work with stakeholders to define Customer 360 data model requirements, identity resolution rules, and calculated insights. Configure and manage the Data Cloud environment, including data streams, data bundles, and harmonization. Implement identity resolution, micro-segmentation, and activation strategies. Collaborate with Salesforce Marketing Cloud to enable real-time personalization and journey orchestration. Ensure data governance and platform security. Monitor data quality, ingestion jobs, and overall platform performance.

Education: BE/B.Tech in Computer Science or IT, or Master of Computer Applications.

Work Experience: Overall experience of minimum 10 years in data management and data engineering roles, with a minimum of 3 years as a Salesforce Data Cloud data engineer. Hands-on experience with Salesforce Data Cloud (CDP), including data ingestion, harmonization, and segmentation. Proficient in working with large datasets, data modeling, and ETL/ELT processes. Understanding of Salesforce core clouds (Sales, Service, Marketing) and how they integrate with Data Cloud. Experience with Salesforce tools such as Marketing Cloud. Strong knowledge of SQL, JSON, Apache Iceberg and data transformation logic. Familiarity with identity resolution and Customer 360 data unification concepts. Salesforce certifications (e.g., Salesforce Data Cloud Accredited Professional, Salesforce Administrator, Platform App Builder). Experience with CDP platforms other than Salesforce, e.g. Segment, Adobe Experience Platform (good to have). Experience with cloud data storage and processing tools (Azure, Snowflake, etc.).

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management.
Technical Competencies: Life Sciences Knowledge, Azure SQL, SQL, Databricks.
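A minimal, platform-agnostic sketch of the identity-resolution concept at the heart of this role: records from different systems are matched on a shared key (here, a normalized email) and merged into one unified profile. This only illustrates the idea; Salesforce Data Cloud does this declaratively through match and reconciliation rules, not hand-written code, and the field names here are hypothetical.

```python
from collections import defaultdict

records = [
    {"source": "crm",       "email": "A.Sharma@Example.com", "phone": None,      "name": "A. Sharma"},
    {"source": "marketing", "email": "a.sharma@example.com", "phone": "+91-98x", "name": None},
]

profiles = defaultdict(dict)
for rec in records:
    key = rec["email"].strip().lower()          # match rule: normalized email
    for field, value in rec.items():
        if value is not None and field != "source":
            profiles[key].setdefault(field, value)  # reconciliation: first non-null wins

print(dict(profiles))   # one unified profile for both source records
```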

Posted 1 month ago

Apply

7 - 10 years

17 - 22 Lacs

Mumbai

Work from Office

Source: Naukri

Position Overview: The Microsoft Cloud Data Engineering Lead role is ideal for an experienced Microsoft Cloud data engineer who will architect, build, and optimize data platforms using Microsoft Azure technologies. The role requires deep technical expertise in Azure data services, strong leadership capabilities, and a passion for building scalable, secure, and high-performance data ecosystems.

Key Responsibilities: Lead the design, development, and deployment of enterprise-scale data pipelines and architectures on Microsoft Azure. Manage and mentor a team of data engineers, promoting best practices in cloud engineering, data modeling, and DevOps. Architect and maintain data platforms using Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure SQL/SQL MI. Develop robust ETL/ELT workflows for structured and unstructured data using Azure Data Factory and related tools. Collaborate with data scientists, analysts, and business units to deliver data solutions supporting advanced analytics, BI, and operational use cases. Implement data governance, quality, and security frameworks, leveraging tools such as Azure Purview and Azure Key Vault. Drive automation and infrastructure-as-code practices using Bicep, ARM templates, or Terraform with Azure DevOps or GitHub Actions. Ensure performance optimization and cost-efficiency across data pipelines and cloud environments. Stay current with Microsoft cloud advancements and help shape cloud strategy and data architecture roadmaps.

Qualifications: Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7+ years of experience in data engineering, including 3+ years working with Microsoft Azure, plus proven leadership experience in managing and mentoring data engineering teams. Skills: Expert knowledge of Azure Data Lake, Synapse Analytics, Data Factory, Databricks, and Azure SQL-based technologies. Proficiency in SQL, Python, and/or Spark for data transformation and analysis. Strong understanding of data governance, security, compliance (e.g., GDPR, PCI DSS), and privacy in cloud environments. Experience leading data engineering teams or cloud data projects from design to delivery. Familiarity with CI/CD pipelines, infrastructure as code, and DevOps practices within the Azure ecosystem. Familiarity with Power BI and integrating data pipelines with BI/reporting tools. Certifications: Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert.
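One common governance technique in GDPR-aware pipelines like those this role describes is pseudonymizing PII with a keyed hash before data lands in analytical storage, so joins still work but raw identifiers do not leak. A minimal sketch with hypothetical column names (in a real Azure setup the key would come from Key Vault):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-via-key-vault"   # placeholder; fetch from a secret store in practice

def pseudonymize(value: str) -> str:
    # Keyed hashing (HMAC) keeps the mapping stable for joins
    # but irreversible without the key.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

row = {"customer_email": "a.sharma@example.com", "order_total": 1499.0}
row["customer_email"] = pseudonymize(row["customer_email"])
print(row)
```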

Posted 1 month ago

Apply

8 - 10 years

25 - 27 Lacs

Mumbai

Work from Office

Source: Naukri

Experience: 8 years minimum, with at least 5 years in Apache Camel. We are looking for a skilled Java Developer with strong experience in the Apache Camel integration framework. The ideal candidate will have hands-on experience designing, developing, and maintaining integration solutions using Java and Camel, with a solid understanding of enterprise integration patterns and message-driven architectures. Key Responsibilities: Design, develop, and maintain integration solutions using Java and Apache Camel. Build and manage routes for data transformation, mediation, and orchestration. Work with messaging systems such as ActiveMQ, Kafka, and RabbitMQ. Integrate APIs, web services (REST/SOAP), and databases. Write unit and integration tests to ensure code quality and performance. Participate in code reviews and maintain coding best practices. Collaborate with architects, QA, and DevOps teams to deliver high-quality software. Monitor and debug integration flows and resolve performance issues.
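Apache Camel routes themselves are written in Java; purely to illustrate one enterprise integration pattern the posting names, here is a language-neutral sketch in Python of a content-based router, the same decision a Camel choice()/when() route expresses. Queue names are hypothetical.

```python
def route(message: dict) -> str:
    # Mediation step: choose a destination queue from the message content.
    if message.get("type") == "order":
        return "queue:orders"
    if message.get("priority", 0) > 5:
        return "queue:urgent"
    return "queue:default"

for msg in [{"type": "order", "id": 1}, {"type": "alert", "priority": 9}]:
    print(msg, "->", route(msg))
```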

Posted 1 month ago

Apply

8 - 12 years

25 - 30 Lacs

Mumbai

Work from Office

Source: Naukri

Minimum 5 years in Apache Camel. We are looking for a skilled Java Developer with strong experience in the Apache Camel integration framework. The ideal candidate will have hands-on experience designing, developing, and maintaining integration solutions using Java and Camel, with a solid understanding of enterprise integration patterns and message-driven architectures. Key Responsibilities: Design, develop, and maintain integration solutions using Java and Apache Camel. Build and manage routes for data transformation, mediation, and orchestration. Work with messaging systems such as ActiveMQ, Kafka, and RabbitMQ. Integrate APIs, web services (REST/SOAP), and databases. Write unit and integration tests to ensure code quality and performance. Participate in code reviews and maintain coding best practices. Collaborate with architects, QA, and DevOps teams to deliver high-quality software. Monitor and debug integration flows and resolve performance issues.

Posted 1 month ago

Apply

3 - 7 years

14 - 24 Lacs

Bengaluru

Hybrid

Source: Naukri

Role Summary: Construction of modelling and monitoring bases and data quality management for regulatory credit risk (PD, LGD and EAD) and provision (IFRS 9) models.

Role Description: Apply data wrangling and credit risk domain expertise to the creation of modelling and monitoring databases and evaluate their quality. Manage intermediate-level case studies and challenges around data collection, wrangling and data quality with minimal supervision. Perform root-cause analysis and resolve any team hurdles around data quality issues. Identify, develop and implement process and project enhancements. Know the upstream and downstream processes of modelling data creation. Lead process governance meetings with stakeholders (modelling and IT teams). Represent and contribute to internal forums and innovation.

Profile Required: Understanding of design and development of data analytics. Good experience with SQL queries and sound programming knowledge in analytical tools such as SAS, Python and PySpark. Good to have: awareness of Big Data, cloud-based BI stacks and emerging BI trends. Understanding of designing, defining and documenting solution architecture. Ability to gather client requirements, work through specifications and develop solutions in line with project documentation, while working within time guidelines. Knowledge of credit risk modelling such as PD, LGD, CCF and IFRS 9.

Specific Context: Client focus, team spirit, commitment, responsibility, ownership, and innovation.

Environment: At Société Générale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years or your entire career, together we can have a positive impact on the future. Creating, daring, innovating and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis and develop or strengthen your expertise, you will feel right at home with us! Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices and sharing their skills with charities. There are many ways to get involved. We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are translated into our business activity (ESG assessment, reporting, project management or IT activities), our work environment and our responsible practices for environmental protection.
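A minimal sketch, on made-up data, of the kind of modelling-base wrangling this role describes: build an observation table, run a data-quality check, and compute an observed default rate (an empirical PD) per rating bucket.

```python
import pandas as pd

obs = pd.DataFrame({
    "rating":    ["A", "A", "B", "B", "B", "C"],
    "defaulted": [0,   0,   0,   1,   0,   1],   # 12-month default flag
})

# Data-quality check before modelling: the default flag must be strictly 0/1.
assert obs["defaulted"].isin([0, 1]).all(), "invalid default flag found"

# Empirical PD per bucket = defaults / observations.
pd_by_bucket = obs.groupby("rating")["defaulted"].agg(obs_count="size", pd_hat="mean")
print(pd_by_bucket)
```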

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Bengaluru

Hybrid

Source: Naukri

Overall 8+ years of solid experience in data projects. Design, develop, and maintain robust ETL/ELT pipelines for data ingestion, transformation, and storage. Proficient in SQL; must have worked on complex joins, subqueries, functions and procedures, and be able to perform SQL tuning and query optimization without support. Design, develop, and maintain ETL pipelines using Databricks and PySpark to extract, transform, and load data from various sources. Must have good working experience with Delta tables, deduplication, and merging on terabyte-scale data sets. Optimize and fine-tune existing ETL workflows for performance and scalability. Excellent knowledge of dimensional modelling and data warehousing. Must have experience working with large data sets. Experience working with batch and real-time data processing (good to have). Implement data validation and quality checks, and ensure adherence to security and compliance standards. Ability to develop reliable, secure, compliant data processing systems. Work closely with cross-functional teams to support data analytics, reporting, and business intelligence initiatives. Should be self-driven and able to work independently without support.
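A minimal sketch, with hypothetical table and column names, of the Delta-table deduplicate-then-merge pattern this posting asks for: keep the latest record per key from the incoming batch, then upsert it into the target Delta table. Assumes a Databricks/Spark session with the delta-lake package available.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.getOrCreate()
incoming = spark.read.json("/mnt/raw/customers/")          # hypothetical landing path

# Deduplicate the batch: keep only the newest row per customer_id.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (incoming.withColumn("rn", F.row_number().over(w))
                  .filter("rn = 1").drop("rn"))

# Merge (upsert) the deduplicated batch into the target Delta table.
target = DeltaTable.forName(spark, "silver.customers")     # hypothetical table
(target.alias("t")
       .merge(latest.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```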

Posted 1 month ago

Apply

1 - 5 years

6 - 11 Lacs

Pune

Work from Office

Source: Naukri

About The Role: Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud. Corporate Title: Associate. Location: Pune, India.

Role Description: As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, ensuring that the necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leave. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complementary health screening for those aged 35 and above.

Your key responsibilities: Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions. Partner with service/backend engineers to integrate data provided by legacy IT solutions into the databases you design and make it accessible to the services consuming this data. Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of Customer Intelligence, financial reporting and performance controlling. Contribute to data harmonization as well as data cleansing. Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment. Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios. Together with your team, you will run and develop your application self-sufficiently. You'll collaborate with Product Owners as well as team members on the design and implementation of data analytics solutions and act as support during the conception of products and solutions. When you see a process running with high manual effort, you'll fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience. Mandatory Skills: Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python. Excellent knowledge of SQL and NoSQL databases. Experience working in a fast-paced and agile work environment. Working knowledge of public cloud environments.

Preferred Skills: Experience in Dataflow (Apache Beam), Cloud Functions and Cloud Run. Knowledge of workflow management tools such as Apache Airflow/Composer. Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub). Knowledge of GCS buckets, Google Pub/Sub and BigQuery. Knowledge of ETL processes in the data warehouse/data lake environment and how to automate them.

Nice to have: Knowledge of provisioning cloud resources using Terraform. Knowledge of shell scripting. Experience with Git, CI/CD pipelines, Docker, and Kubernetes. Knowledge of Google Cloud Monitoring & Alerting. Knowledge of Cloud Run, Dataform and Cloud Spanner. Knowledge of the Data Vault 2.0 data warehouse approach. Knowledge of New Relic. Excellent analytical and conceptual thinking. Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams. Good communication and experience working with distributed teams (especially Germany + India).

How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm. We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
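A minimal Apache Beam sketch, on in-memory data, of the ETL shape the preferred skills describe: read records, transform, aggregate, and write out. On GCP the same pipeline would read from Pub/Sub or GCS and write to BigQuery via the corresponding I/O connectors; the account values here are made up.

```python
import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | "Create" >> beam.Create([
         {"account": "DE01", "amount": 120.0},
         {"account": "DE01", "amount": 80.0},
         {"account": "DE02", "amount": 30.0},
     ])
     | "ToKV" >> beam.Map(lambda r: (r["account"], r["amount"]))
     | "SumPerAccount" >> beam.CombinePerKey(sum)   # transform/aggregate step
     | "Print" >> beam.Map(print))                  # stand-in for a BigQuery sink
```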

Posted 1 month ago

Apply

2 - 5 years

2 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

Req ID: 318492. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake, Python, Airflow Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties. Team Overview: The Controls Engineering, Measurement and Analytics (CEMA) department is responsible for cyber risk and control assessment, management, monitoring, and reporting capabilities across Technology, resulting in risk reduction and better oversight of the firm's technology risk landscape. Our work is always client focused; our engineers are problem-solvers and innovators. We seek exceptional technologists to help deliver solutions on our user-facing applications, data stores, and reporting and metric platforms while being cloud-centric, leveraging multi-tier architectures and staying aligned with our DevOps and Agile strategies. We are in the process of modernizing our technology stack across multiple platforms with the goal of building scalable, front-to-back assessment, measurement and monitoring systems using the latest cloud, web, and data technologies. We are looking for someone with a systematic problem-solving approach, coupled with a sense of ownership and drive. The successful candidate will be able to influence and collaborate globally. They should be a strong team player, have an entrepreneurial approach, push innovative ideas while appropriately considering risk, and adapt in a fast-paced, changing environment.

Role Summary: As an ETL / Data Engineer, you will be a member of the CEDAR / C3 Data Warehouse team, with a focus on sourcing and storing data from various technology platforms across the firm into a centralized data platform used to build reporting and analytics solutions for the Technology Risk functions within Morgan Stanley. In this role you will be primarily responsible for the development of data pipelines, database views, and stored procedures, in addition to performing technical data analysis and monitoring and tuning queries and data loads. You will work closely with data providers, data analysts, data developers, and data analytics teams to facilitate the implementation of client-specific business requirements and requests.

Key Responsibilities: Develop ETLs, stored procedures, triggers, and views on our existing DB2-based data warehouse and on our new Snowflake-based data warehouse. Perform data profiling and technical analysis on source system data to ensure it can be integrated and represented properly in our models. Monitor the performance of queries and data loads and perform tuning as necessary. Provide assistance and guidance during the QA and UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.

Minimum Skills Required: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field. At least 5+ years of experience in data development and solutions in highly complex data environments with large data volumes. At least 5+ years of experience developing complex ETLs with Informatica PowerCenter. At least 5+ years of SQL / PL/SQL experience with the ability to write ad-hoc and complex queries to perform data analysis. At least 5+ years of experience developing complex stored procedures, triggers, MQTs and views on IBM DB2. Experience with performance tuning of DB2 tables, queries, and stored procedures. An understanding of E-R data models (conceptual, logical, and physical). Strong understanding of advanced data warehouse concepts (factless fact tables, temporal/bi-temporal models, etc.). Experience with Python a plus. Experience developing data transformations using dbt a plus. Experience with Snowflake a plus. Experience with Airflow a plus. Experience using Spark (PySpark) for data loading and complex transformations a plus. Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions. Strong communication skills, both verbal and written. Capable of collaborating effectively.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Cloud, Data Warehouse, Database, Computer Science, Quality Assurance, Technology
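A minimal Airflow sketch (with hypothetical task logic and DAG name) of the pipeline shape this role describes: extract from a source system, then load into Snowflake. A real deployment would replace the print stubs with a Snowflake connector or provider hook.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source platform")        # stub

def load_to_snowflake():
    print("write staged rows into a Snowflake table")  # stub

with DAG(dag_id="risk_data_pipeline",                  # hypothetical DAG name
         start_date=datetime(2024, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:
    PythonOperator(task_id="extract", python_callable=extract) \
        >> PythonOperator(task_id="load", python_callable=load_to_snowflake)
```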

Posted 1 month ago

Apply

2 - 5 years

2 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

Req ID: 318488. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake, Python, Airflow Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties. Team Overview: The Controls Engineering, Measurement and Analytics (CEMA) department is responsible for cyber risk and control assessment, management, monitoring, and reporting capabilities across Technology, resulting in risk reduction and better oversight of the firm's technology risk landscape. Our work is always client focused; our engineers are problem-solvers and innovators. We seek exceptional technologists to help deliver solutions on our user-facing applications, data stores, and reporting and metric platforms while being cloud-centric, leveraging multi-tier architectures and staying aligned with our DevOps and Agile strategies. We are in the process of modernizing our technology stack across multiple platforms with the goal of building scalable, front-to-back assessment, measurement and monitoring systems using the latest cloud, web, and data technologies. We are looking for someone with a systematic problem-solving approach, coupled with a sense of ownership and drive. The successful candidate will be able to influence and collaborate globally. They should be a strong team player, have an entrepreneurial approach, push innovative ideas while appropriately considering risk, and adapt in a fast-paced, changing environment.

Role Summary: As an ETL / Data Engineer, you will be a member of the CEDAR / C3 Data Warehouse team, with a focus on sourcing and storing data from various technology platforms across the firm into a centralized data platform used to build reporting and analytics solutions for the Technology Risk functions within Morgan Stanley. In this role you will be primarily responsible for the development of data pipelines, database views, and stored procedures, in addition to performing technical data analysis and monitoring and tuning queries and data loads. You will work closely with data providers, data analysts, data developers, and data analytics teams to facilitate the implementation of client-specific business requirements and requests.

Key Responsibilities: Develop ETLs, stored procedures, triggers, and views on our existing DB2-based data warehouse and on our new Snowflake-based data warehouse. Perform data profiling and technical analysis on source system data to ensure it can be integrated and represented properly in our models. Monitor the performance of queries and data loads and perform tuning as necessary. Provide assistance and guidance during the QA and UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.

Minimum Skills Required: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field. At least 5+ years of experience in data development and solutions in highly complex data environments with large data volumes. At least 5+ years of experience developing complex ETLs with Informatica PowerCenter. At least 5+ years of SQL / PL/SQL experience with the ability to write ad-hoc and complex queries to perform data analysis. At least 5+ years of experience developing complex stored procedures, triggers, MQTs and views on IBM DB2. Experience with performance tuning of DB2 tables, queries, and stored procedures. An understanding of E-R data models (conceptual, logical, and physical). Strong understanding of advanced data warehouse concepts (factless fact tables, temporal/bi-temporal models, etc.). Experience with Python a plus. Experience developing data transformations using dbt a plus. Experience with Snowflake a plus. Experience with Airflow a plus. Experience using Spark (PySpark) for data loading and complex transformations a plus. Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions. Strong communication skills, both verbal and written. Capable of collaborating effectively.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Cloud, Data Warehouse, Computer Science, Database, SQL, Technology

Posted 1 month ago

Apply

1 - 4 years

4 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE. Role Description: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 2-5 years of experience; OR Diploma with 6-8 years of experience.

Functional Skills. Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Must have knowledge of MDM, data governance, stewardship, and profiling practices. In addition to the above, candidates with experience on Informatica or Reltio MDM platforms will be preferred.

Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts.

Professional Certifications: Any ETL certification (e.g. Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
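A minimal sketch, on made-up records, of the data profiling this role calls for: null rates, distinct counts, and duplicate detection on key master-data columns, the checks typically run before data enters an MDM platform.

```python
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 102, 102, 103],
    "email": ["a@x.com", None, "b@x.com", "b@x.com"],
})

# Column-level profile: how complete and how varied each field is.
profile = pd.DataFrame({
    "null_rate":      df.isna().mean(),
    "distinct_count": df.nunique(),
})

# Rows sharing a key are candidate merge/DCR records.
dupes = df[df.duplicated("customer_id", keep=False)]

print(profile)
print("possible duplicates:\n", dupes)
```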

Posted 1 month ago

Apply

0 - 2 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE. Role Description: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 2-5 years of experience; OR Diploma with 6-8 years of experience.

Functional Skills. Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Must have knowledge of MDM, data governance, stewardship, and profiling practices. In addition to the above, candidates with experience on Informatica or Reltio MDM platforms will be preferred.

Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts.

Professional Certifications: Any ETL certification (e.g. Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

5 - 10 years

15 - 30 Lacs

Chennai

Remote

Source: Naukri

Design, develop, and maintain data solutions focused on importing, processing, and transforming client CMS data for AI systems. Responsibilities include pipeline optimization, collaboration, code development, API integration, and cloud data management. Experience in Python, Node.js and PHP.

Posted 1 month ago

Apply

3 - 8 years

3 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Name of Organization: Jarus Technologies (India) Pvt. Ltd.
Organization Website: www.jarustech.com
Position: Senior Software Engineer - Data Warehouse
Domain Knowledge: Insurance (Mandatory)
Job Type: Permanent
Location: Hyderabad - IDA Cherlapally, ECIL and Divyasree Trinity, Hi-Tech City
Experience: 3+ years
Education: B. E. / B. Tech. / M. C. A.
Resource Availability: Immediately or within a maximum period of 30 days.

Technical Skills:
• Strong knowledge of data warehousing concepts and technologies.
• Proficiency in SQL and other database languages.
• Experience with ETL tools (e.g., Informatica, Talend, SSIS).
• Familiarity with data modelling techniques.
• Experience in building dimensional data modelling objects, dimensions, and facts.
• Experience with cloud-based data warehouse platforms (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
• Familiar with optimizing SQL queries and improving ETL processes for better performance.
• Knowledge of data transformation, cleansing, and validation techniques.
• Experience with incremental loads, change data capture (CDC) and data scheduling.
• Comfortable with version control systems like Git.
• Familiar with BI tools like Power BI for visualization and reporting.

Responsibilities:
• Design, develop and maintain data warehouse systems and ETL (Extract, Transform, Load) processes.
• Develop and optimize data models and schemas to support business needs.
• Design and implement data warehouse architectures, including physical and logical designs.
• Design and develop dimensions, facts and bridges.
• Ensure data quality and integrity throughout the ETL process.
• Design and implement relational and multidimensional database structures.
• Understand data structures and fundamental design principles of data warehouses.
• Analyze and modify data structures to adapt them to business needs.
• Identify and resolve data quality issues and data warehouse problems.
• Debug ETL processes and data warehouse queries.

Communication Skills:
• Good communication skills to interact with customers.
• Ability to understand requirements for implementing an insurance warehouse system.
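A minimal sketch, on made-up insurance-style tables, of the incremental-load pattern this posting lists: use a high-water mark on a change timestamp so each run picks up only rows changed since the previous load, a simple form of CDC.

```python
import pandas as pd

source = pd.DataFrame({
    "policy_id":  [1, 2, 3],
    "premium":    [5000, 7200, 6100],
    "updated_at": pd.to_datetime(["2024-03-01", "2024-03-05", "2024-03-09"]),
})

last_loaded = pd.Timestamp("2024-03-04")   # high-water mark from the prior run

delta = source[source["updated_at"] > last_loaded]   # only changed rows
new_watermark = delta["updated_at"].max()            # persist for the next run

print(delta)
print("next high-water mark:", new_watermark)
```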

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office

Source: Naukri

About The Role: The candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. He/she must be able to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, he/she must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager Role and Responsibilities: Process mapping and identifying non-value-add steps / friction points in the process. Discover, monitor and improve processes by extracting and analysing knowledge from event logs in the Process Mining/Celonis tool. Work alongside both technical and non-technical stakeholders to understand business challenges, help design process mining initiatives and prioritize requests. Act as the customer's key contact and guide them through revealing process trends, inefficiencies and bottlenecks in the business process. Support validation of data (counts and values between source systems and Celonis). Work on process insights by creating KPIs and actions, identify process inefficiencies, and understand the root causes. Develop workflows to monitor processes, detect anomalies and turn those insights into real-time automated preventive or corrective actions using Action Engine, Action Flows and other capabilities.

Technical and Functional Skills: Bachelor's degree in Computer Science with 3+ years of work experience in data analytics, data mining and data transformation. Very proficient in Celonis; should be able to build, manage, and extract value from Celonis models for various use cases: adding or modifying data sources, creating automated alerts, Action Engine, Transformation Center, Celonis ML Workbench. Experience in SQL / PQL scripting and knowledge of data mining; should be able to apply complex queries to build transformations, e.g. joins, unions, window functions. Knowledge of process improvement techniques/tools and process mining/analytics. Basic knowledge of Python scripting (NumPy, Pandas, Seaborn, Matplotlib, scikit-learn, etc.). Experience in BI tools (e.g., Tableau, Power BI) nice to have. Strong communication and presentation skills. Understanding of business processes.
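A minimal sketch, on a made-up event log, of the core process-mining computation behind tools like Celonis: reconstruct each case's activity sequence (its process variant) and measure case durations to spot bottlenecks.

```python
import pandas as pd

log = pd.DataFrame({
    "case_id":   ["A", "A", "A", "B", "B"],
    "activity":  ["Create PO", "Approve PO", "Pay Invoice", "Create PO", "Pay Invoice"],
    "timestamp": pd.to_datetime(["2024-01-01", "2024-01-03", "2024-01-10",
                                 "2024-01-02", "2024-01-04"]),
}).sort_values(["case_id", "timestamp"])

# Variant = the ordered sequence of activities within a case.
variants = log.groupby("case_id")["activity"].agg(" -> ".join)

# Throughput time per case, the usual first bottleneck indicator.
durations = log.groupby("case_id")["timestamp"].agg(lambda t: t.max() - t.min())

print(variants, durations, sep="\n")
```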

Posted 1 month ago

Apply

3 - 8 years

7 - 11 Lacs

Hyderabad

Work from Office

Source: Naukri

The Impact You Will Have in This Role: We are seeking a skilled Talend Developer with expertise in Power BI development and SQL Server to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes using Talend, creating insightful data visualizations with Power BI, and will be an expert in writing stored procedures and queries on MS SQL Server databases.

What You'll Do: Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources. Create and maintain data visualizations and dashboards using Power BI to provide actionable insights to stakeholders. Write high-performance queries on SQL Server databases, ensuring data integrity, performance, and security. Collaborate with cross-functional teams to gather requirements, design solutions, and implement data integration and reporting solutions. Troubleshoot and resolve issues related to ETL processes, data visualizations, and database performance. Collaborate with other team members and analysts through the delivery cycle. Participate in an Agile delivery team that builds high-quality and scalable work products. Support production releases and maintenance windows, working with the Operations team.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success: Minimum 3+ years writing ETL processes. Proven experience as a Talend Developer, with a strong understanding of ETL processes and data integration. Proficiency in Power BI development, including creating dashboards, reports, and data models. Expertise in SQL Server, including database design, optimization, and performance tuning. Strong understanding of Agile processes (Kanban and Scrum) and a working knowledge of JIRA is required. Strong analytical and problem-solving skills, with the ability to work independently and as part of a team. Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.

Additional Qualifications Needed for Success: Talend Expertise: Proficiency in using Talend Studio for data integration, data quality and file manipulation, including designing and developing ETL processes, creating and managing Talend jobs, and using Talend components for data transformation and integration. Data Integration Knowledge in Talend: Understanding of data integration concepts and best practices, including experience with data extraction, transformation, and loading (ETL) processes, as well as knowledge of data warehousing and data modeling. Database Skills: Proficiency in working with various databases, including MS SQL and/or Oracle; this includes writing complex SQL queries, understanding database schemas, and performing data migrations. Version Control and Collaboration: Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence), important for managing code changes, collaborating with team members, and tracking project progress. Job Scheduling and Automation: Experience with job scheduling and automation tools, including setting up and managing Talend jobs using schedulers such as Talend Administration Center (TAC), Autosys or third-party tools to automate ETL workflows. Data Visualization: Ability to create visually appealing and insightful reports and dashboards; this involves selecting appropriate visualizations, designing layouts, and using custom visuals in Power BI when necessary. Power Query: Expertise in using Power Query for data transformation and preparation, involving cleaning, merging, and shaping data from various sources. Expertise in scripting languages such as Python, and shell/batch programming, is a plus.
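A minimal sketch of the SQL Server side of this role, with a hypothetical connection string, table, and stored procedure: running a parameterized query and calling a procedure via pyodbc. Parameterization keeps query plans reusable and prevents SQL injection.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"   # hypothetical
)
cur = conn.cursor()

# Parameterized ad-hoc query against a hypothetical fact table.
cur.execute("SELECT TOP 5 customer_id, SUM(amount) AS total "
            "FROM dbo.FactSales WHERE sale_date >= ? "
            "GROUP BY customer_id ORDER BY total DESC", "2024-01-01")
for row in cur.fetchall():
    print(row.customer_id, row.total)

# Executing a hypothetical stored procedure with a parameter (ODBC call syntax).
cur.execute("{CALL dbo.usp_RefreshDailySales (?)}", "2024-01-01")
conn.commit()
```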

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies