5.0 - 10.0 years
9 - 13 Lacs
Gurugram
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood. Graduate with a minimum of 5 years of related experience required. Experience in modelling and business system design. Strong hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed in data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting with all levels of the organization. Ability to communicate complex business problems and technical solutions.
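Illustrative sketch (not part of the posting): the kind of batch T-SQL loading step this role describes might look like the following, using pyodbc to run an incremental MERGE from a staging extract into a star-schema fact table. The server, credentials, and table names are hypothetical.

```python
import pyodbc

# Hypothetical connection details; real values would come from secure configuration.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dw-server.example.com;DATABASE=SalesDW;"
    "UID=etl_user;PWD=***"
)

# Incremental upsert from a staging table into a fact table (T-SQL MERGE).
merge_sql = """
MERGE dbo.FactSales AS tgt
USING stg.SalesExtract AS src
    ON tgt.OrderID = src.OrderID
WHEN MATCHED THEN
    UPDATE SET tgt.Quantity  = src.Quantity,
               tgt.NetAmount = src.NetAmount,
               tgt.LoadDate  = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, CustomerKey, DateKey, Quantity, NetAmount, LoadDate)
    VALUES (src.OrderID, src.CustomerKey, src.DateKey,
            src.Quantity, src.NetAmount, SYSUTCDATETIME());
"""

with conn:  # pyodbc commits on successful exit from the connection context
    cursor = conn.cursor()
    cursor.execute(merge_sql)
    print(f"Rows affected: {cursor.rowcount}")
```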
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood. Graduate with a minimum of 5 years of related experience required. Experience in modelling and business system design. Strong hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed in data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting with all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
5.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood. Graduate with a minimum of 5 years of related experience required. Experience in modelling and business system design. Strong hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed in data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting with all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
As a Senior Backend/Lead Development Engineer you will develop automation solutions to provision and manage infrastructure across your organization. As a developer you will leverage the capabilities of Terraform and cloud offerings to drive infrastructure-as-code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team; collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions; maintain high standards of software quality within the team by establishing good practices and habits; and focus on growing capabilities to support and enhance the experience of the offering. Required education: Bachelor's Degree. Required technical and professional expertise: * 10+ years of software development experience with z/OS or z/OS subsystems. * 8+ years of professional experience developing with Golang, Python, and Ruby. * Hands-on experience with z/OS system programming or administration. * Experience with Terraform key features such as infrastructure as code, change automation, and auto scaling. * Experience working with a cloud provider such as AWS, Azure, or GCP, with a focus on scalability, resilience, and security. * Cloud-native mindset and solid understanding of DevOps principles in a cloud environment. * Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging, and tracing for high reliability and performance. * Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform). * Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions. * Demonstrated ability to tackle complex technical challenges and deliver innovative solutions. * Excellent communication and collaboration skills, with a focus on customer satisfaction and team success. * Strong analytical, debugging, and problem-solving skills to analyze issues and defects reported by customer-facing and test teams. * Proficient in source control management tools (GitHub) and Agile lifecycle management tools. * Soft skills: strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback.
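As an illustration of the Terraform-driven change automation this posting mentions (not drawn from the job description itself), the sketch below wraps the standard terraform init/plan/apply CLI workflow in Python. The working directory and variable values are hypothetical.

```python
import json
import subprocess

def run(cmd: list[str], cwd: str) -> subprocess.CompletedProcess:
    """Run a Terraform CLI command and fail loudly on error."""
    return subprocess.run(cmd, cwd=cwd, check=True, capture_output=True, text=True)

workspace = "./zos-infra"          # hypothetical Terraform configuration directory
instance_count = 2                 # hypothetical input variable

run(["terraform", "init", "-input=false"], workspace)

# Produce a machine-readable plan so the change can be reviewed before applying.
run(["terraform", "plan", "-input=false", "-out=tfplan",
     f"-var=instance_count={instance_count}"], workspace)
plan_json = run(["terraform", "show", "-json", "tfplan"], workspace).stdout
changes = json.loads(plan_json).get("resource_changes", [])
print(f"{len(changes)} resource change(s) pending review")

# Apply only the reviewed plan file (change automation with an approval gate).
run(["terraform", "apply", "-input=false", "tfplan"], workspace)
```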
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Ab Initio Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will be pivotal in driving innovation and efficiency within the team, while also maintaining open lines of communication with stakeholders to keep them informed of progress and developments. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must-Have Skills: Proficiency in Ab Initio. - Strong understanding of data integration and ETL processes. - Experience with performance tuning and optimization of applications. - Familiarity with data warehousing concepts and methodologies. - Ability to troubleshoot and resolve application issues effectively. Additional Information: - The candidate should have a minimum of 5 years of experience in Ab Initio. - This position is based at our Pune office. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Microsoft Azure Data Services Good to have skills : Microsoft Azure Databricks. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. Roles & Responsibilities: Data Integration: Develop and implement data pipelines using Azure Data Factory, Azure Fabric, and MDMF (Metadata Management Framework) to ingest, transform, and store data from various sources. Data Modeling: Maintain data models, ensuring data quality and consistency across different databases and systems. Database Management: Manage Azure SQL Databases and other storage solutions to optimize performance and scalability. ETL Processes: Design and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and availability for analytics and reporting. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide actionable insights. Documentation: Maintain clear documentation of data architectures, data flows, and pipeline processes. Professional & Technical Skills: Proficiency in Azure services such as Azure Data Factory, Azure Fabric, MDMF, and Azure SQL Database. Good knowledge of SQL and experience with programming languages such as Python and PySpark. Familiarity with data modeling techniques and data warehousing concepts. Experience with cloud architecture and data architecture best practices. Understanding of data governance and security principles. Excellent problem-solving skills and attention to detail. Strong communication skills for collaborating with technical and non-technical stakeholders. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services. - This position is based at our Bengaluru office. - 15 years of full-time education is required. Qualification: 15 years full time education
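Purely as an illustrative sketch of the PySpark transformation work described above (the storage paths and column names are hypothetical, and in practice would be supplied by the orchestrating pipeline, e.g. an Azure Data Factory activity), a minimal batch curation step might look like:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Hypothetical ADLS paths; real ones would arrive as pipeline parameters.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders/"

orders = (
    spark.read.option("header", "true").csv(raw_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])                 # basic data-quality rule
    .filter(F.col("order_id").isNotNull())
)

# Partition by date so downstream analytics queries prune efficiently.
(orders.withColumn("order_date", F.to_date("order_ts"))
       .write.mode("overwrite")
       .partitionBy("order_date")
       .parquet(curated_path))
```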
Posted 1 month ago
3.0 - 7.0 years
9 - 14 Lacs
Mumbai
Work from Office
As a Consultant, you are responsible for developing application designs and providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Configure DataStax Cassandra as per the requirements of the project solution. Design the database system specific to Cassandra in consultation with the data modelers, data architects, and ETL specialists, as well as the microservices/functional specialists, thereby producing an effective Cassandra database system according to the solution and the client's needs and specifications. Interface with functional and data teams to ensure that integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database. The role will also need to ensure that the Cassandra database is performant and error-free; this will involve troubleshooting and resolving errors and performance issues, as well as planning for further database improvements. Ensure the database documentation and operation manual are up to date and usable. Preferred technical and professional experience: Has expertise, experience, and deep knowledge in the configuration, design, and troubleshooting of NoSQL server software and related products on the cloud, specifically DataStax Cassandra. Has knowledge/experience in other NoSQL/cloud databases. Installs, configures, and upgrades RDBMS or NoSQL server software and related products on the cloud.
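Illustrative sketch (not part of the posting): the Cassandra design and configuration work described above might, in its simplest form, resemble the following use of the DataStax Python driver. The contact points, credentials, keyspace, and table are hypothetical, and the table is shaped around its query pattern (lookup of orders by customer).

```python
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

# Hypothetical contact points and credentials.
auth = PlainTextAuthProvider(username="app_user", password="***")
cluster = Cluster(["10.0.0.11", "10.0.0.12"], auth_provider=auth)
session = cluster.connect()

# Keyspace and table designed around the access pattern (orders by customer).
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS orders_ks
    WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS orders_ks.orders_by_customer (
        customer_id uuid,
        order_ts    timestamp,
        order_id    uuid,
        total       decimal,
        PRIMARY KEY ((customer_id), order_ts)
    ) WITH CLUSTERING ORDER BY (order_ts DESC)
""")

# Prepared statements keep writes efficient and avoid repeated query parsing.
insert = session.prepare(
    "INSERT INTO orders_ks.orders_by_customer "
    "(customer_id, order_ts, order_id, total) VALUES (?, ?, ?, ?)"
)
```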
Posted 1 month ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP S/4HANA for Product Compliance Good to have skills : NA. Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead end-to-end SAP EHS Global Label Management (GLM) implementations within S/4HANA Product Compliance projects. You'll be responsible for project delivery, stakeholder engagement, and ensuring regulatory alignment. Roles & Responsibilities: - Manage full-cycle implementation of SAP GLM. - Define labelling strategies and oversee WWI template delivery. - Coordinate cross-functional teams across product safety, compliance, and regulatory domains. - Collaborate with clients and business users to gather requirements and translate them into effective EHS solutions. - Configure and maintain the SAP EHS Product Safety module, including specifications, phrase management, and data architecture. - Design and validate WWI report templates and guide ABAP developers with symbol logic, layout, and enhancements. - Implement and support SAP GLM (Global Label Management), including label determination, print control, and output conditions. Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP EH&S GLM with end-to-end implementation experience. - Deep expertise in SAP GLM, label determination logic, and print control setup. - Strong knowledge of S/4HANA Product Compliance architecture. - Excellent communication and team management skills. - 8+ years in SAP EHS with 2+ full-cycle GLM implementations. Additional Information: - This position is based at our Hyderabad office. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Data Modeler Project Role Description : Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills : Snowflake Data Warehouse Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training sessions for junior team members to enhance their understanding of data modeling. - Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse. - Strong understanding of data modeling concepts and best practices. - Experience with ETL processes and data integration techniques. - Familiarity with data governance and data quality frameworks. - Ability to communicate complex data concepts to non-technical stakeholders. Additional Information: - The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. - This position is based in Pune. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Data Modeler Project Role Description : Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills : Snowflake Data Warehouse Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training sessions for junior team members to enhance their understanding of data modeling. - Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse. - Strong understanding of data modeling concepts and best practices. - Experience with ETL processes and data integration techniques. - Familiarity with data governance and data quality frameworks. - Ability to communicate complex data concepts to non-technical stakeholders. Additional Information: - The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. - This position is based in Pune. - 15 years of full-time education is required. Qualification: 15 years full time education
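Illustrative sketch (not part of the posting): the Snowflake dimensional modeling this role centers on could be materialized with DDL such as the following, issued through the Snowflake Python connector. The account, credentials, and table definitions are hypothetical examples of a star-schema dimension/fact pair.

```python
import snowflake.connector

# Hypothetical account and credentials; use a secrets manager in practice.
conn = snowflake.connector.connect(
    account="xy12345", user="MODELER", password="***",
    warehouse="MODELING_WH", database="ANALYTICS", schema="SALES",
)

ddl_statements = [
    # Conformed customer dimension with a surrogate key.
    """CREATE TABLE IF NOT EXISTS DIM_CUSTOMER (
           CUSTOMER_SK   NUMBER IDENTITY,
           CUSTOMER_ID   STRING NOT NULL,
           CUSTOMER_NAME STRING,
           SEGMENT       STRING,
           EFFECTIVE_TS  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
       )""",
    # Fact table keyed by the dimension's surrogate key.
    """CREATE TABLE IF NOT EXISTS FACT_ORDERS (
           ORDER_ID    STRING NOT NULL,
           CUSTOMER_SK NUMBER,
           ORDER_DATE  DATE,
           NET_AMOUNT  NUMBER(18,2)
       )""",
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```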
Posted 1 month ago
6.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Roles and Responsibilities: Design and implement data models, data flows, and data pipelines to support business intelligence and analytics. Develop and maintain large-scale data warehouses and data lakes using technologies such as Hadoop, Spark, and NoSQL databases. Collaborate with cross-functional teams to identify business requirements and develop solutions that meet those needs. Ensure data quality, integrity, and security by implementing data validation, testing, and monitoring processes. Stay up-to-date with industry trends and emerging technologies to continuously improve the organization's data architecture capabilities. Provide technical leadership and guidance on data architecture best practices to junior team members. Job Requirements: Strong understanding of data modeling, data warehousing, and ETL processes. Experience with big data technologies such as Hadoop, Spark, and NoSQL databases. Excellent problem-solving skills and the ability to analyze complex business problems and develop creative solutions. Strong communication and collaboration skills to work effectively with stakeholders at all levels. Ability to design and implement scalable, secure, and efficient data architectures. Experience working in an agile environment with continuous integration and delivery.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, and take the lead in raising the performance bar, building capability, and bringing out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes. Associate Program Manager Roles and responsibilities: Understand clients' requirements and provide effective and efficient solutions in Snowflake. Understand data transformation and translation requirements and which tools to leverage to get the job done. Ability to do Proofs of Concept (POCs) in areas that need R&D on cloud technologies. Understand data pipelines and modern ways of automating data pipelines using cloud-based tools. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL. Technical and Functional Skills: Master's/Bachelor's degree in Engineering, Analytics, or a related field. Total 7+ years of experience, with ~4+ years of relevant hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Time Travel, replication, and zero-copy cloning. Strong working knowledge of Python. Understanding of data transformation and translation requirements and which tools to leverage to get the job done. Understanding of data pipelines and modern ways of automating data pipelines using cloud-based tools; test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. In-depth understanding of data warehouses and ETL tools. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL. Experience with Snowflake APIs is mandatory. Candidates must have strong knowledge of scheduling and monitoring using Airflow DAGs. Strong experience in writing SQL queries, joins, stored procedures, and user-defined functions. Sound knowledge of data architecture and design. Hands-on experience in developing Python scripts for data manipulation. Snowflake SnowPro Core certification. Developing scripts using Unix, Python, etc.
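Illustrative sketch (not part of the posting): the "scheduling and monitoring using Airflow DAGs" plus Snowflake loading that this role asks for might, in a minimal form, look like the DAG below. The DAG id, connection parameters, stage, and table names are hypothetical; the schedule syntax assumes a recent Airflow 2.x release.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
import snowflake.connector


def load_daily_extract(**context):
    """Copy a staged daily extract into a Snowflake table (names are hypothetical)."""
    conn = snowflake.connector.connect(
        account="xy12345", user="ETL_USER", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    try:
        conn.cursor().execute(
            "COPY INTO STAGING.DAILY_SALES "
            "FROM @SALES_STAGE/daily/ FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow handles scheduling, retries, and monitoring
    catchup=False,
) as dag:
    PythonOperator(task_id="load_daily_extract", python_callable=load_daily_extract)
```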
Posted 1 month ago
11.0 - 18.0 years
30 - 45 Lacs
Hyderabad
Work from Office
Role & responsibilities We are looking for an experienced Data Architect with deep expertise in Snowflake technologies to lead the design, development, and deployment of scalable data architectures. This role involves building robust data pipelines, optimizing data warehouses, and supporting complex data migrations, ensuring data quality, security, and governance across all layers. Preferred candidate profile Data Modeling : Star, Snowflake, Data Vault, hybrid schemas, partitioning, clustering Databases : Snowflake, Oracle, SQL Server, Greenplum, PostgreSQL ETL/ELT Tools : Informatica IDMC, DataStage Big Data Tools : Hadoop, Hive Cloud Integration : AWS Services S3, EC2, Lambda, Glue Programming Languages : Python, PySpark Schedulers : Control-M Data Security : RBAC, data masking, encryption, audit trails, compliance (HIPAA, GDPR) Automation : Advanced SQL, API integration, DevOps practices Data Governance : Data quality, lineage, cataloging, MDM, metadata management
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Exciting opportunity for a Delivery Lead Data Architect to join a high-growth analytics environment. You will be responsible for leading end-to-end technical delivery across data platforms, ensuring robust architecture, performance, and cross-functional alignment. Location : Gurugram (Hybrid) Your Future Employer: A reputed analytics-driven organization focused on delivering innovative and scalable data solutions, known for its inclusive work culture and continuous learning environment. Responsibilities: 1) Leading design and delivery across the complete SDLC for data and analytics projects 2) Translating business requirements into scalable data architecture and models 3) Collaborating with engineering, BI, testing, and support teams for smooth execution 4) Guiding development of ETL pipelines and reporting workflows 5) Mentoring the team on best practices in data modeling, engineering, and architecture 6) Driving client communication, estimations, and technical workshops Requirements: 1) Bachelor's or Master's in Computer Science, IT, or a related field 2) 6+ years of experience in data architecture and delivery leadership 3) Proficiency in SQL, Python, data modeling, and ETL tools 4) Experience with cloud platforms (AWS, Azure, or GCP) and Power BI 5) Strong understanding of SDLC, DevOps, and managed services delivery 6) Excellent communication, stakeholder management, and team leadership skills What's in it for you: 1) Opportunity to lead enterprise-level data programs 2) Work across modern cloud-native technologies 3) Competitive compensation with growth opportunities 4) Inclusive and collaborative work environment
Posted 1 month ago
5.0 - 10.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Key Responsibilities: Design, develop, and maintain Qlik View applications and dashboards. Collaborate with business stakeholders to gather requirements and translate them into technical specifications. Perform data analysis and create data models to support business intelligence initiatives. Optimize Qlik View applications for performance and scalability. Provide technical support and troubleshooting for Qlik View applications. Ensure data accuracy and integrity in all Qlik View applications. Integrate Snowflake with Qlik View to enhance data processing and analytics capabilities. Stay updated with the latest Qlik View features and best practices. Conduct training sessions for end-users to maximize the utilization of Qlik View applications. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience between 2-5 years as a Qlik View Developer. Strong knowledge of Qlik View architecture, data modeling, and scripting. Proficiency in SQL and database management. Knowledge of Snowflake and its integration with Qlik View. Excellent analytical and problem-solving skills. Ability to work independently and as part of a team. Strong communication and interpersonal skills.
Posted 1 month ago
9.0 - 14.0 years
10 - 20 Lacs
Mumbai, Bengaluru
Work from Office
Greetings from Future Focus Infotech!!! We have multiple opportunities Data architect Exp: 9+yrs Location : Mumbai / Bangalore Job Type- This is a Permanent position with Future Focus Infotech Pvt Ltd & you will be deputed with our client. A small glimpse about Future Focus Infotech Pvt Ltd. (Company URL: www.focusinfotech.com) If you are interested in above opportunity, send updated CV and below information to reema.b@focusinfotech.com Kindly mention the below details. Total Years of Experience: Current CTC: Expected CTC: Notice Period : Current location: Available for interview on weekdays: Pan Card : Thanks & Regards, Reema reema.b@focusinfotech.com 8925798887
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai
Work from Office
Design and implement data architecture and models for Big Data solutions using MapR and Hadoop ecosystems. You will optimize data storage, ensure data scalability, and manage complex data workflows. Expertise in Big Data, Hadoop, and MapR architecture is required for this position.
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Design and implement data architectures and models, focusing on data warehouses and Snowflake-based environments. Ensure that data is structured for efficient querying and analysis, aligning with business goals and performance requirements.
Posted 1 month ago
10.0 - 17.0 years
12 - 22 Lacs
Gurugram
Work from Office
We know the importance that food plays in people's lives and the power it has to bring people, families, and communities together. Our purpose is to bring enjoyment to people's lives through great tasting food, in a way which reflects our values. McCain has recently committed to implementing regenerative agriculture practices across 100 percent of our potato acreage by 2030. Ask us more about our commitment to sustainability.
OVERVIEW: McCain is embarking on a digital transformation. As part of this transformation, we are making significant investments into our data platforms, common data models, data structures, and data policies to increase the quality of our data, the confidence of our business teams to use this data to make better decisions, and the value we drive through the use of data. We have a new and ambitious global Digital & Data group, which serves as a resource to the business teams in our regions and global functions. We are currently recruiting an experienced Data Architect to build the enterprise data model for McCain.
JOB PURPOSE: Reporting to the Data Architect Lead, the Global Data Architect will take a lead role in creating the enterprise data model for McCain Foods, bringing together data assets across agriculture, manufacturing, supply chain, and commercial. This data model will be the foundation for our analytics program, which seeks to bring together McCain's industry-leading operational data sets with third-party data sets to drive world-class analytics. Working with a diverse team of data governance experts, data integration architects, data engineers, and our analytics team, including data scientists, you will play a key role in creating the conceptual, logical, and physical data models that underpin the Global Digital & Data team's activities.
JOB RESPONSIBILITIES: Develop an understanding of McCain's key data assets and work with the data governance team to document key data sets in our enterprise data catalog. Work with business stakeholders to build a conceptual business model by understanding the end-to-end business process, challenges, and future business plans. Collaborate with application architects to bring in the analytics point of view when designing end-user applications. Develop the logical data model based on the business model and align it with business teams. Work with technical teams to build the physical data model and data lineage, and keep all relevant documentation current. Develop a process to manage all models and appropriate controls. With a use-case-driven approach, enhance and expand the enterprise data model based on legacy on-premises analytics products and new cloud data products, including advanced analytics models. Design key enterprise conformed dimensions and ensure understanding across data engineering teams (including third parties); keep the data catalog and wiki tools current. Act as the primary point of contact for new Digital and IT programs, to ensure alignment to the enterprise data model. Be a key player in shaping McCain's cloud migration strategy, enabling advanced analytics and world-leading Business Intelligence analytics. Work in close collaboration with data engineers, ensuring data modeling best practices are followed.
MEASURES OF SUCCESS: Demonstrated history of driving change in a large, global organization. A true passion for well-structured and well-governed data; you know and can explain to others the real business risk of too many mapping tables. You live for a well-designed and well-structured conformed dimension table. Focus on use-case-driven prioritization; you are comfortable pushing business teams for requirements that connect to business value and able to challenge requirements that will not achieve the business' goals. Developing data models that are not just elegant, but truly optimized for analytics, both for advanced analytics use cases and dashboarding/BI tools. A coaching mindset wherever you go, including with the business, data engineers, and other architects. An infectious enthusiasm for learning: about our business, deepening your technical knowledge, and meeting our teams. A "get things done" attitude; roll up the sleeves when necessary and work with and through others as needed.
KEY QUALIFICATIONS & EXPERIENCE: Data Design and Governance: At least 5 years of experience with data modeling to support business processes. Ability to design complex data models to connect internal and external data. Nice to have: ability to profile data for data quality requirements. At least 8 years of experience with requirements analysis; experience working with business stakeholders on data design. Experience working with real-time data. Nice to have: experience with data catalog tools. Ability to draft accurate documentation that supports the project management effort and coding. Technical skills: At least 5 years of experience designing and working in data warehouse solutions building data models; preference for S/4HANA knowledge. At least 2 years of experience with visualization tools, preferably Power BI or similar. At least 2 years designing and working in cloud data warehouse solutions; preference for Azure Databricks, Azure Synapse, or earlier Microsoft solutions. Experience with Visio, PowerDesigner, or similar data modeling tools. Nice to have: experience with data profiling tools such as Informatica, Collibra, or similar data quality tools. Nice to have: working experience with MDX. Experience working in an Azure cloud environment or a similar cloud environment. Must have: ability to develop queries in SQL for assessing, manipulating, and accessing data stored in relational databases, and hands-on experience in PySpark and Python. Nice to have: ability to understand and work with unstructured data. Nice to have: at least one successful enterprise-wide cloud migration as the data architect or data modeler, mainly focused on building data models. Nice to have: experience working with manufacturing/digital manufacturing. Nice to have: experience designing enterprise data models for analytics, specifically in a Power BI environment. Nice to have: experience with machine learning model design (Python preferred).
Behaviors and Attitudes: Comfortable working with ambiguity and defining a way forward. Experience challenging current ways of working. A documented history of successfully driving projects to completion. Excellent interpersonal and communication skills. Attention to detail. Comfortable leading others through change.
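Illustrative sketch (not part of the posting): the value of the conformed dimensions this role emphasizes is that different fact tables can be reported on one shared calendar or customer view. A minimal Spark SQL example, with hypothetical metastore tables, might look like:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("conformed-dimension-demo").getOrCreate()

# Hypothetical curated tables registered in the metastore. The same dim_date is
# shared ("conformed") by supply-chain and commercial facts, so metrics from both
# domains roll up to one fiscal calendar.
weekly_view = spark.sql("""
    SELECT d.fiscal_week,
           SUM(s.shipped_qty)  AS shipped_qty,
           SUM(c.invoiced_amt) AS invoiced_amt
    FROM   dim_date d
    LEFT JOIN fact_shipments s ON s.date_key = d.date_key
    LEFT JOIN fact_invoices  c ON c.date_key = d.date_key
    GROUP BY d.fiscal_week
""")

weekly_view.show(5)
```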
Posted 1 month ago
5.0 - 10.0 years
17 - 30 Lacs
Bengaluru
Work from Office
Position Overview We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have expertise in Data Architecture, Data Modeling, Data Governance, PySpark, and Databricks to support our data-driven initiatives. You will collaborate with data scientists, analysts, and business teams to ensure high-quality, reliable, and secure data solutions. Key Responsibilities Design and implement scalable data pipelines for batch and real-time processing. Develop and maintain data models to support analytics and business intelligence. Ensure data governance best practices, including data quality, lineage, and compliance. Optimize and manage Databricks environments for efficient data processing. Write and optimize PySpark jobs for large-scale data transformations. Collaborate with stakeholders to define data architecture standards and best practices. Automate data workflows and improve ETL/ELT processes. Troubleshoot and resolve data-related issues in production environments. Required Skills & Qualifications Must-Have: Strong experience in Data Architecture and Data Modeling (relational, dimensional, NoSQL). Hands-on experience with PySpark for big data processing. Proficiency in Azure Databricks (or Databricks on other clouds). Knowledge of Data Governance frameworks (metadata management, data lineage, security). Expertise in SQL and experience with cloud data platforms (Azure, AWS, or GCP). Familiarity with ETL/ELT tools and workflow orchestration (e.g., Airflow, Azure Data Factory). Nice-to-Have: Experience with Delta Lake, Snowflake, or Synapse . Knowledge of CI/CD pipelines for data engineering. Understanding of machine learning data pipelines . Certifications in Databricks, Azure Data Engineer, or AWS Big Data .
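Illustrative sketch (not part of the posting): combining the PySpark, Databricks/Delta, and data-governance themes above, a minimal pipeline step with a simple quality gate might look like the following. It assumes a Databricks or Delta-enabled runtime; the landing path, table name, and threshold are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-pipeline").getOrCreate()

raw = spark.read.json("/mnt/raw/customers/")   # hypothetical landing path

# Governance-style quality gate: reject the batch if too many rows lack the key.
total = raw.count()
bad = raw.filter(F.col("customer_id").isNull()).count()
if total and bad / total > 0.01:
    raise ValueError(f"Data quality gate failed: {bad}/{total} rows lack customer_id")

curated = (
    raw.dropDuplicates(["customer_id"])
       .withColumn("ingest_ts", F.current_timestamp())
)

# Write to a Delta table so downstream consumers get ACID guarantees and history.
(curated.write.format("delta")
        .mode("append")
        .saveAsTable("governed.customers"))
```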
Posted 1 month ago
14.0 - 21.0 years
40 - 60 Lacs
Noida, Gurugram, Bengaluru
Hybrid
The opportunity: We're looking for an Associate Director – Data & AI Strategy. The main objective of the role is to develop and articulate a comprehensive and forward-looking Data & AI strategy that aligns with the overall business strategy and objectives. Skills: Data Engineering, Data Strategy & Governance with Go-to-Market. Skills and attributes for success 15-17 years of total experience with 10+ years in the Data Strategy and architecture field 15-17 years of total experience with 7+ years in AI strategy and implementation Strong knowledge of data architecture, data models, and cutover strategies using industry-standard tools and technologies Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP) Solid hands-on professional experience (10+ years) with the creation and implementation of data science engagements and helping create AI/ML products Proven track record of implementing machine-learning solutions, development in multiple languages, and statistical analysis 7+ years of experience in Azure database offerings [Relational, NoSQL, Data Warehouse] 7+ years of hands-on experience in various Azure services, preferably Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services, and Databricks Minimum of 8 years of hands-on database design, modeling, and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL, and Azure Synapse Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn) Strong creative instincts related to data analysis and visualization; aggressive curiosity to learn the business methodology, data model, and user personas Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade, and namespace management Willingness to mentor team members Solid analytical, technical, and problem-solving skills Excellent written and verbal communication skills Strong project and people management skills with experience in serving global clients. To qualify for the role, you must have a Master's Degree in Computer Science, Business Administration, or equivalent work experience. Fact-driven and analytically minded with excellent attention to detail Hands-on experience with data engineering tasks such as building analytical data records and experience manipulating and analysing large volumes of data Relevant work experience of a minimum of 15 to 17 years in a Big 4 or technology/consulting setup Help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients; this may entail working either as an independent SME or as part of a larger team. Ideally, you'll also have: the ability to think strategically and end-to-end with a results-oriented mindset Ability to build rapport within the firm and win the trust of clients Willingness to travel extensively and to work on client sites/practice office locations Strong experience in SQL Server and MS Excel plus at least one other SQL dialect, e.g. MS Access, PostgreSQL, Oracle PL/SQL, or MySQL. Strong in data structures and algorithms Experience interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc. Preferred exposure to JSON, Cloud Foundry, Pivotal, MATLAB, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, AngularJS, Python, etc. What we look for: A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What we offer: EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Data Modeling Techniques and Methodologies Good to have skills : Data Engineering. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively. Roles & Responsibilities: - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Modeling Techniques and Methodologies. - Good-to-Have Skills: Experience with Data Engineering. - Strong understanding of database design principles and data architecture. - Experience with data integration and ETL processes. - Familiarity with data warehousing concepts and technologies. - Good understanding of the IFW industry data model. - Conversant with industry-standard data modelling tools. - Information modelling on master data domains (e.g., Party, Agreement, Location) plus a canonical message model. - Maintaining the logical and physical data models for the Master Data Management solution. - Maintaining and updating the Data Element Catalog for consumers as applicable. - Supporting business/technical stakeholders on any information required on data models and related artifacts. Additional Information: - The candidate should have a minimum of 7 years of experience in Data Modeling Techniques and Methodologies. - This position is based at our Bengaluru office. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago
10.0 - 15.0 years
1 - 6 Lacs
Hyderabad
Hybrid
Role: GCP Data Architect Experience: 10+ years Work location: Hyderabad (Hybrid work from office) Notice period: Immediate joiners to 30 days max (candidates who can join within 15 days are preferred) Shift timing: 2:30 PM to 11:30 PM (IST) We are looking for a GCP Data Architect with deep technical expertise in cloud-native data platforms and architecture, who also brings experience in building practices/CoEs and engaging in pre-sales solutioning. This role will be instrumental in driving cloud data transformations for enterprise clients, shaping reusable accelerators, and leading conversations with key stakeholders. Required Skills & Qualifications: 10+ years of overall experience, with 3+ years as an architect on GCP data platforms. Expertise in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Looker, Data Catalog, and IAM. Hands-on experience with Terraform or similar IaC tools. Proficiency in Python, SQL, or Apache Beam for data processing pipelines. Solid grasp of data governance, security, lineage, and compliance frameworks. Strong understanding of hybrid/multi-cloud data strategies, data lakehouses, and real-time analytics. Demonstrated ability to build internal CoEs or practices with reusable assets and frameworks. Experience collaborating with CXOs, Data Leaders, and Enterprise Architects. Strong communication skills, written and verbal, for stakeholder and customer engagement. Preferred Certifications: Professional Cloud Architect – Google Cloud; Professional Data Engineer – Google Cloud
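Illustrative sketch (not part of the posting): a small example of the BigQuery-centric work this role describes, using the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical; credentials are taken from the environment.

```python
from google.cloud import bigquery

# Credentials come from the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS);
# the project and dataset names below are hypothetical.
client = bigquery.Client(project="example-analytics-prj")

query = """
    SELECT order_date, SUM(net_amount) AS revenue
    FROM `example-analytics-prj.sales.fact_orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# BigQuery enforces IAM roles and column-level policies server-side,
# so governance rules apply regardless of which client issues the query.
job = client.query(query)
for row in job.result():
    print(row["order_date"], row["revenue"])
```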
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Data Modeling Techniques and Methodologies Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that all components function seamlessly together. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities and foster a culture of continuous improvement. - Mentor junior professionals to help them develop their skills and grow within the organization. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Modeling Techniques and Methodologies. - Strong understanding of database design principles and data architecture. - Experience with data integration and ETL processes. - Familiarity with data warehousing concepts and technologies. - Ability to analyze and optimize data models for performance and scalability. Additional Information: - The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies. - This position is based in Mumbai. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Ab Initio Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for managing your tasks effectively while ensuring high-quality deliverables and maintaining a focus on continuous improvement. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities and foster a culture of learning. - Monitor project progress and provide timely updates to stakeholders to ensure alignment with project goals. Professional & Technical Skills: - Must-Have Skills: Proficiency in Ab Initio. - Strong understanding of data integration and ETL processes. - Experience with performance tuning and optimization of data processing workflows. - Familiarity with data quality and data governance principles. - Ability to troubleshoot and resolve technical issues in a timely manner. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Ab Initio. - This position is based at our Hyderabad office. - 15 years of full-time education is required. Qualification: 15 years full time education
Posted 1 month ago