
13124 ETL Jobs - Page 4

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Hyderābād

On-site

We are seeking an experienced IBM ITX Developer with a strong background in IBM Transformation Extender (ITX) to join our team. The ideal candidate will have hands-on expertise in developing, implementing, and debugging ITX maps, along with managing EDI transactions in the Logistics & Supply Chain domain.

Primary Skills:
- IBM ITX (WebSphere Transformation Extender) Development: proficiency in designing, developing, and debugging ITX maps.
- Data Formats: experience with various data formats, including ANSI X12, EDIFACT, XML, JSON, flat files (FF), and CSV.
- Communication and data handling protocols: experience or strong knowledge of AS2, EDI, HTTPS, FTPS/SFTP/SCP/OFTP, POP3/SMTP, Web Services, etc.

Good to Have:
- Logistics Domain Knowledge: familiarity with logistics processes, supply chain management, and warehouse operations.
- Communication Protocols: experience with communication protocols such as AS2, SFTP, and others relevant to logistics.
- Technical Problem-Solving: ability to identify, analyse, and resolve technical issues independently.
- Teamwork and Collaboration: strong interpersonal skills and the ability to work effectively in a team environment.

Nice to Have:
- Experience working with complementary IBM products: the IBM WebSphere family, IBM PEM Community Manager (PCM), Lightwell B2B Framework.
- Working experience with ETL, Java, Python, or cloud technologies is a strong plus.
- Familiarity with the Supply Chain & Logistics domain (EDI 204, 210, 214, 810, 820, etc.).

Education: Bachelor's degree or equivalent in Computer Science, MIS, or a similar discipline.
Accreditation: specific business accreditation for Business Intelligence.
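The core of ITX development is mapping between these formats. As a rough illustration of the raw material an ITX map consumes, here is a minimal, hypothetical Python sketch that splits an ANSI X12 interchange into segments and elements (the delimiters are assumed fixed here; real interchanges declare their own separators in the ISA header, which production code must read first):

```python
def parse_x12(raw: str, seg_term: str = "~", elem_sep: str = "*") -> list[list[str]]:
    """Split a raw X12 interchange into segments, each a list of elements.

    The '~' segment terminator and '*' element separator are assumptions
    for this sketch; a real parser derives them from the ISA envelope.
    """
    segments = []
    for seg in raw.strip().split(seg_term):
        seg = seg.strip()
        if seg:  # skip the empty trailing piece after the last terminator
            segments.append(seg.split(elem_sep))
    return segments
```

An ITX map would go far beyond this, validating envelopes and translating between X12, EDIFACT, XML, and flat-file structures, but the segment/element split above is the shape of the input data.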
Experience: relevant work experience in data engineering based on the following number of years:
- Associate: prior experience not required
- Standard I: two (2) years
- Standard II: three (3) years
- Senior I: four (4) years
- Senior II: five (5) years

Knowledge, Skills and Abilities: fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills.

Additional Details: FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company: FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy: The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future.
The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle: we return these profits back into the business and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture: Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description: Solution Architects assess a project's technical feasibility, as well as implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience.

Job Description - Grade Specific: Managing Solution/Delivery Architect.
- Design, deliver and manage complete solutions.
- Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen.
- Work as a stream lead at CIO/CTO level for an internal or external client.
- Lead Capgemini operations relating to market development and/or service delivery excellence.
- Are seen as a role model in their (local) community.

Certification: preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF and/or industry certifications such as TOGAF 9 or equivalent.
Skills (competencies): SDLC Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (NoSQL Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication

Posted 22 hours ago

Apply

3.0 - 4.0 years

4 - 12 Lacs

Hyderābād

On-site

Job Description: Summary: The Data Engineer will be responsible for designing, developing, and maintaining the data infrastructure for a healthcare organization. The ideal candidate will have experience working with healthcare data, including EHR, HIMS, PACS, and RIS, as well as experience with SQL, Elasticsearch, and data integration tools such as Talend.

Key Responsibilities:
- Data Pipeline Development: design, develop, and maintain scalable data pipelines using Microsoft Fabric.
- Data Integration: integrate data from various sources, ensuring data quality and consistency.
- Data Transformation: perform data cleaning, transformation, and aggregation to support analytics and reporting.
- Performance Optimization: optimize data processing workflows for performance and scalability.
- Collaboration: work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Documentation: create and maintain documentation for data processes, workflows, and infrastructure.

Required Skills and Qualifications:
- Experience: 3-4 years of experience in data engineering or a related field.
- Technical Skills: proficiency in Microsoft Fabric and its components; strong knowledge of SQL and database management systems; experience with big data technologies (e.g., Spark, Hadoop); familiarity with data warehousing concepts and ETL processes.
- Programming Skills: proficiency in programming languages such as Python, Java, or Scala; Python preferred.
- Analytical Skills: strong problem-solving skills and the ability to analyze complex data sets.
- Communication Skills: excellent verbal and written communication skills.

Preferred Qualifications:
- Certifications: relevant certifications in data engineering or Microsoft technologies.
- Experience: experience with cloud platforms; working in Azure is a must.
- Tools: familiarity with data visualization tools (e.g., Power BI, Tableau).
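The cleaning, transformation, and aggregation duties described above can be illustrated with a small, dependency-free Python sketch; the department/charge record schema is purely hypothetical, standing in for the kind of billing rows a healthcare pipeline might normalize:

```python
from collections import defaultdict

def aggregate_charges(records):
    """Clean raw rows (hypothetical schema: 'dept', 'charge') and aggregate
    total charges per department.

    Cleaning pass: department names are trimmed and upper-cased so variants
    collapse to one key; rows with a blank dept or non-numeric charge are
    dropped rather than poisoning the aggregate.
    """
    totals = defaultdict(float)
    for row in records:
        dept = (row.get("dept") or "").strip().upper()
        try:
            charge = float(row.get("charge"))
        except (TypeError, ValueError):
            continue  # drop unparseable charges
        if dept:
            totals[dept] += charge
    return dict(totals)
```

In a real Microsoft Fabric or Spark pipeline the same clean-then-aggregate shape appears, just expressed over distributed DataFrames instead of Python lists.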
Job Types: Full-time, Permanent
Pay: ₹400,000.00 - ₹1,200,000.00 per year
Benefits: flexible schedule, health insurance, paid time off
Schedule: day shift, Monday to Friday
Experience (preferred): Data Engineer: 3 years; SQL: 2 years; Python: 2 years; ETL: 2 years; Spark: 2 years; Azure: 2 years
Work Location: In person

Posted 22 hours ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site

Digital Solutions Consultant I - HYD015Q

Company: Worley | Primary Location: IND-AP-Hyderabad | Job: Digital Solutions | Schedule: Full-time | Employment Type: Agency Contractor | Job Level: Experienced | Job Posting: Jun 16, 2025 | Unposting Date: Jul 16, 2025 | Reporting Manager Title: Senior General Manager

We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.

Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role: As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases, file stores, or SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned with architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You: To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor's degree in Computer Science or a related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming) and implementing incremental data loads.
- Proficient in scripting in Java and Windows PowerShell.
- Proficient in at least one programming language such as Python or Scala.
- Expert in SQL.
- Proficient in working with data services such as ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL stores (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks, or similar services on AWS/GCP.
- Experience using ETL tools (such as Informatica IICS Data Integration) is an advantage.
- Strong understanding of Data Quality principles and experience implementing them.

Moving forward together: We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here.

Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
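A recurring requirement in this role is implementing incremental loads within batch and streaming integration patterns. A minimal watermark-based sketch in Python (the row schema and ISO-8601 string timestamps are illustrative assumptions; a real pipeline would persist the watermark in a control table):

```python
def extract_incremental(source_rows, watermark):
    """Watermark-based incremental extract: return only rows changed since
    the last run, plus the new watermark to persist for the next run.

    `source_rows` is a hypothetical list of dicts carrying an ISO-8601
    'modified' timestamp, which compares correctly as a plain string.
    """
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    # Advance the watermark to the newest row seen; keep the old one if
    # nothing changed, so the next run picks up where this one left off.
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark
```

The same pattern underlies ADF incremental copy activities and change-data-capture feeds: track a high-water mark, pull only what exceeds it, then advance the mark atomically with the load.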

Posted 22 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Delhi

On-site

Job requisition ID: 84234 | Date: Jun 15, 2025 | Location: Delhi | Designation: Senior Consultant

What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team: Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Learn more about the Analytics and Information Management Practice.

Work you'll do: As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking a highly skilled Senior AWS DevOps Engineer with 6-10 years of experience to lead the design, implementation, and optimization of AWS cloud infrastructure, CI/CD pipelines, and automation processes. The ideal candidate will have in-depth expertise in Terraform, Docker, Kubernetes, and big data technologies such as Hadoop and Spark. You will be responsible for overseeing the end-to-end deployment process, ensuring the scalability, security, and performance of cloud systems, and mentoring junior engineers.
Overview: We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.
Experience: 2 to 7 years. Location: Bangalore, Chennai, Coimbatore, Delhi, Mumbai, Bhubaneswar.

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Required Qualifications:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics; ETL/ELT processes; data lake architectures
- Version control: GitHub

Your role as a leader: At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and to make an impact that matters. In addition to living our purpose, Senior Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you will grow: At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career.
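The data lake responsibilities above (S3 plus Glue and Athena) usually rest on a Hive-style partition layout, which is what lets Athena prune partitions instead of scanning the whole bucket. A hedged Python sketch of building such an object key; the bucket name and layout are illustrative assumptions, not a prescribed standard:

```python
def lake_partition_key(table: str, event_date: str, file_name: str,
                       bucket: str = "example-data-lake") -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) of the kind
    Glue crawlers register and Athena uses for partition pruning.

    `event_date` is expected as YYYY-MM-DD; the bucket default is a
    placeholder for this sketch.
    """
    year, month, day = event_date.split("-")
    return (f"s3://{bucket}/{table}/"
            f"year={year}/month={month}/day={day}/{file_name}")
```

Writing every file under such keys means a query filtered on `year`, `month`, and `day` reads only the matching prefixes, which is where most of the Athena cost savings in a lake design come from.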
Explore Deloitte University, The Leadership Centre.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose: Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips: We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.

Posted 22 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Delhi

On-site

Job requisition ID: 84245 | Date: Jun 15, 2025 | Location: Delhi | Designation: Consultant

What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team: Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Learn more about the Analytics and Information Management Practice.

Work you'll do: As a Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking a highly skilled Senior AWS DevOps Engineer with 6-10 years of experience to lead the design, implementation, and optimization of AWS cloud infrastructure, CI/CD pipelines, and automation processes. The ideal candidate will have in-depth expertise in Terraform, Docker, Kubernetes, and big data technologies such as Hadoop and Spark. You will be responsible for overseeing the end-to-end deployment process, ensuring the scalability, security, and performance of cloud systems, and mentoring junior engineers.
Overview: We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.
Experience: 2 to 7 years. Location: Bangalore, Chennai, Coimbatore, Delhi, Mumbai, Bhubaneswar.

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Required Qualifications:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics; ETL/ELT processes; data lake architectures
- Version control: GitHub

Your role as a leader: At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and to make an impact that matters. In addition to living our purpose, Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you will grow: At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career.
Explore Deloitte University, The Leadership Centre.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose: Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips: We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.

Posted 22 hours ago

Apply

8.0 - 9.0 years

0 Lacs

Gurgaon

Remote

Colt provides network, voice and data centre services to thousands of businesses around the world, allowing them to focus on delivering their business goals instead of the underlying infrastructure.

Job ID: 35500 | Job Level: PT1 Core | Job Location: Gurgaon/Bangalore | Function: DIO | Employment Type: Full time | Working Pattern: Hybrid

Why we need this role: The Salesforce Platform & Integration Lead Developer is required to design, develop, and maintain custom solutions and integrations on the Salesforce platform across Sales Cloud, Experience Cloud and Marketing Cloud. This role requires deep expertise in Salesforce platform development, API integrations, and best practices, ensuring seamless data flow and high-performing solutions.

What you will do:

Leadership:
- Lead architecture and design discussions with technical architects, solution architects, technical teams, and developers.
- Lead and mentor junior Salesforce developers and administrators, providing technical guidance and support.

Platform Development:
- Design and develop scalable, high-performing Salesforce solutions across Sales Cloud, Experience Cloud and Marketing Cloud.
- Design and develop custom solutions on the Salesforce platform using Apex, Visualforce, Lightning components, integrations and other Salesforce technologies.
- Ensure data integrity, security, and compliance within the Salesforce environment.
- Implement and maintain Salesforce configurations, custom objects, workflows, validation rules, and other standard functionalities.
- Develop and maintain Lightning Web Components to enhance the user experience.
- Stay updated with Salesforce innovations and emerging technologies to optimize solutions.
- Support business teams in identifying opportunities for automation and process improvements.
Integration Development:
- Design and implement integrations between Salesforce and other enterprise systems using APIs, middleware tools like MuleSoft, third-party tools and data integration techniques.
- Ensure seamless data flow and synchronisation between Salesforce and external systems.
- Troubleshoot and resolve integration issues, ensuring data integrity and consistency.

Quality Assurance:
- Conduct code reviews to ensure code quality, performance, and security.
- Develop and execute unit tests, integration tests, and user acceptance tests.
- Troubleshoot and resolve issues related to Salesforce applications and integrations.

Documentation & Training:
- Create and maintain technical documentation for Salesforce solutions and integrations.
- Provide training and support to end-users and team members on Salesforce functionalities and best practices.

What you will bring:
- 8-9 years of experience in Salesforce development and integration.
- Proficiency in Salesforce platforms such as Sales Cloud, Experience Cloud, and Marketing Cloud.
- Experience in multi-cloud Salesforce implementations.
- Proven experience with Apex, Visualforce, Lightning components, Salesforce Flows and Salesforce integrations.
- Strong understanding of Salesforce best practices, design patterns, and development methodologies.
- Experience with integration tools such as MuleSoft, webMethods, REST/SOAP APIs, and ETL solutions.
- Experience with JavaScript, HTML, CSS, and web services.
- Knowledge of and ability to follow the SDLC as well as Agile/Scrum methodologies.
- Experience with standard operating procedures for pre- and post-production support activities.
- Good verbal and presentation skills.
- Good written communication skills.
- Good problem-solving skills.
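For a flavour of the REST integrations described above: middleware commonly reaches Salesforce through its REST query resource. A hedged Python sketch that composes that endpoint for a SOQL statement (the instance hostname and API version are placeholders, and OAuth bearer-token authentication is omitted):

```python
from urllib.parse import urlencode

def soql_query_url(instance: str, soql: str, api_version: str = "v58.0") -> str:
    """Compose the Salesforce REST endpoint for running a SOQL query.

    `instance` is the org's My Domain hostname (placeholder in this sketch);
    the caller would send a GET to this URL with an OAuth bearer token in
    the Authorization header.
    """
    query = urlencode({"q": soql})  # URL-encode the SOQL statement
    return f"https://{instance}/services/data/{api_version}/query?{query}"
```

MuleSoft or webMethods flows wrap the same endpoint with connection pooling, retries, and token refresh, but the request shape is exactly this URL.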
Qualifications:
BE/B.Tech/ME/M.Tech/MCA/M.Sc
Certifications in the following will be strongly preferred:
o Salesforce Certified Platform Developer I & II
o Salesforce Certified Integration Architecture Designer

Skills:
Applications System Design
Applications Development
System Maintenance and Enhancement
System Development Life Cycle
Applications Knowledge

Education:
A bachelor's or master's degree in computer science, software engineering, or a closely related field.

What we offer you:
Looking to make a mark? At Colt, you'll make a difference. Because around here, we empower people. We don't tell you what to do. Instead, we employ people we trust, who come together across the globe to create intelligent solutions. Our global teams are full of ambitious, driven people, all working together towards one shared purpose: to put the power of the digital universe in the hands of our customers wherever, whenever and however they want. We give our people the opportunity to inspire and lead teams, and work on projects that connect people, cities, businesses, and ideas. We want you to help us change the world, for the better.

Diversity and inclusion:
Inclusion and valuing diversity of thought and experience are at the heart of our culture here at Colt. From day one, you'll be encouraged to be yourself because we believe that's what helps our people to thrive. We welcome people with diverse backgrounds and experiences, regardless of their gender identity or expression, sexual orientation, race, religion, disability, neurodiversity, age, marital status, pregnancy status, or place of birth. Most recently we have:
Signed the UN Women Empowerment Principles which guide our Gender Action Plan
Trained 60 (and growing) Colties to be Mental Health First Aiders
Please speak with a member of our recruitment team if you require adjustments to our recruitment process to support you. For more information about our Inclusion and Diversity agenda, visit our DEI pages.
Benefits:
Our benefits support you through all parts of life, for both physical and mental health.
Flexible working hours and the option to work from home.
Extensive induction program with experienced mentors and buddies.
Opportunities for further development and educational opportunities.
Global Family Leave Policy.
Employee Assistance Program.
Internal inclusion & diversity employee networks.

A global network:
When you join Colt you become part of our global network. We are proud of our colleagues and the stories and experience they bring – take a look at 'Our People' site including our Empowered Women in Tech.

Posted 22 hours ago

Apply

4.0 years

0 Lacs

Gurgaon

Remote

Job Title: Senior Digital Analyst (MCI)
Location: Chandigarh, India
Department: Data Analyst
Job Type: Full-Time

About Us:
TRU IT is a global leader dedicated to leveraging cutting-edge technology to drive business innovation and growth. We specialize in crafting data-driven digital strategies, optimizing marketing performance, and delivering transformative insights that empower businesses. Our expertise spans multiple industries, combining advanced analytics, digital marketing, and emerging technologies to drive measurable results.

Position Overview:
We are seeking an experienced Senior Digital Analyst with a strong background in Marketing Cloud Intelligence (MCI), BigQuery, Snowflake, etc. The ideal candidate will be responsible for managing and analyzing marketing data, optimizing performance, and developing data-driven strategies to enhance business growth. This role requires expertise in MCI (formerly Datorama) for reporting, visualization, and campaign performance tracking.

Job Location and Address:
This is a full-time onsite role (no hybrid or remote option) at the following location: Plot No E 275, Industrial Area, Sector 75, Sahibzada Ajit Singh Nagar, Punjab 160071

Responsibilities:
1. Marketing Cloud Intelligence (MCI) & Digital Analytics:
Manage and optimize MCI dashboards to track marketing performance, campaign effectiveness, and business KPIs.
Develop custom data models within MCI to aggregate, clean, and transform marketing data from multiple sources.
Automate data ingestion and transformation pipelines in MCI for seamless reporting.
Perform advanced marketing analytics to identify trends, improve attribution models, and enhance campaign effectiveness.
2. Data Management & Integration:
Develop and maintain data architectures using Snowflake, BigQuery, etc.
Extract, process, and analyze large datasets from multiple marketing platforms.
Integrate MCI with Google Analytics, Adobe Analytics, and Google Tag Manager to consolidate reporting and drive actionable insights.
Optimize ETL pipelines to ensure efficient data processing and reporting.
3. Performance Reporting & Business Insights:
Develop custom dashboards in MCI, Looker Studio, and Excel for marketing performance tracking.
Analyze multi-channel marketing campaigns (PPC, social, programmatic) and provide optimization recommendations.
Deliver monthly, quarterly, and ad-hoc reports to key stakeholders on marketing performance and ROI.
Conduct cohort and segmentation analysis to improve customer retention and acquisition strategies.
4. Collaboration & Strategy:
Work closely with marketing, product, and data teams to align data-driven insights with business goals.
Provide recommendations to optimize budget allocation, audience targeting, and media spend efficiency.
Stay updated on MCI enhancements, industry trends, and new analytics tools.

Requirements:
Bachelor's or Master's degree in Data Science, Computer Science, Marketing, Business Analytics, or a related field.
4+ years of experience in digital marketing analytics, business intelligence, and data management.
Proven expertise in MCI (Marketing Cloud Intelligence/Datorama), including dashboard development and data transformations.
Strong hands-on experience with Snowflake, BigQuery, and SQL.
Experience in Adobe Analytics, Google Analytics (GA4), etc.
Experience in ETL processes, API integrations, and marketing data automation.
Strong analytical, problem-solving, and communication skills.
Ability to work in a fast-paced environment, managing multiple projects and deadlines.

What We Offer:
Competitive salary and benefits package.
Opportunities for professional growth and development.
A collaborative and innovative work environment.
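The performance-reporting work described above is, at its core, rollups of spend and revenue by channel. A minimal sketch in plain Python with invented sample data (an MCI or Looker Studio dashboard would surface the same metric from the connected sources):

```python
from collections import defaultdict

def channel_roi(rows):
    """Aggregate spend and revenue per channel and compute ROI as
    (revenue - spend) / spend, rounded to two decimals."""
    spend = defaultdict(float)
    revenue = defaultdict(float)
    for r in rows:
        spend[r["channel"]] += r["spend"]
        revenue[r["channel"]] += r["revenue"]
    return {c: round((revenue[c] - spend[c]) / spend[c], 2) for c in spend}

# Invented sample campaign rows for illustration.
campaigns = [
    {"channel": "ppc",    "spend": 1000.0, "revenue": 1800.0},
    {"channel": "social", "spend": 500.0,  "revenue": 450.0},
    {"channel": "ppc",    "spend": 200.0,  "revenue": 600.0},
]
print(channel_roi(campaigns))  # {'ppc': 1.0, 'social': -0.1}
```

The same aggregation, expressed as a `GROUP BY` in BigQuery or Snowflake SQL, would feed the dashboard directly; the Python form just makes the calculation explicit.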

Posted 22 hours ago

Apply

175.0 years

2 - 4 Lacs

Gurgaon

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
American Express' Internal Audit Group (IAG) has reinvented our audit process and is leading the financial services industry with our Data-Driven Continuous Auditing methodology, embedding intelligence through the audit lifecycle. IAG's strategic initiatives, combined with our greatest asset – our people – enable IAG to utilize advanced data analysis capabilities, provide greater and continuous assurance and forward-looking risk insights, and help ensure quality products and services are provided to American Express customers. The IAG Analytics & Insights team is looking for those who share our mission and aspirations and are passionate about the use of data and technology in a collaborative, people- and risk-focused environment. We are looking for a dynamic leader to drive our Data Management and Business Intelligence (BI) agenda. This role will combine strategic vision with hands-on execution to build and optimize data pipelines, BI solutions, and analytic systems that empower decision-making for the department and the enterprise at large.

Key Responsibilities:
Leadership and Strategy
Lead and mentor a cross-functional team of BI developers, engineers, and project managers.
Define and execute the data and BI strategy, aligning with business priorities.
Partner with business stakeholders to prioritize and deliver impactful analytics solutions.
Project Management
Manage the full lifecycle of BI and analytic projects, including scoping, planning, resource allocation, and timeline management.
Ensure projects are delivered on time, within scope and budget, with clear reporting to leadership.
Solution Development
Guide the development and scaling of data pipelines, reporting systems, and BI tools.
Ensure solutions are high-performing, user-friendly, and adhere to data governance standards.
Support cloud migrations, including integration of BI and Machine Learning tools for analytic development and production solutions.
Provide leadership and oversight for development and deployment of analytic solutions (including advanced analytics) across Audit portfolios.
Enablement & Adoption
Serve as a bridge between business users and technical teams.
Promote adoption of BI solutions through training, support, and change management.
Drive process improvement and automation within BI workflows.
Governance and Compliance
Implement and enforce data governance and data quality standards to ensure data integrity and security.
Oversee the development and adherence to best practices for data access, reporting, and compliance with industry regulations.

Qualifications
Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field. MBA or advanced degrees preferred.
10+ years of experience in data and business intelligence, with at least 5 years in a leadership or managerial role.
Experience with cloud data platforms (AWS, Azure, Google Cloud).
Strong expertise in BI tools (e.g., Power BI, Tableau, Qlik), automation solutions and data modeling techniques.
Experience with data integration, ETL processes, and data warehousing concepts.
Proven ability to design and implement end-to-end BI solutions and data architectures.
Experience managing cross-functional teams and driving organizational change.
Expertise in data governance, security, and compliance best practices.
Excellent communication and interpersonal skills, with the ability to engage with both technical teams and business stakeholders.
Project management experience and familiarity with Agile methodologies.
Strong problem-solving and analytical skills, with a focus on delivering actionable insights from complex data.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 22 hours ago

Apply

0 years

12 - 20 Lacs

Gurgaon

Remote

Position: GCP Data Engineer

Company Info: Prama (HQ: Chandler, AZ, USA)
Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries and help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in the USA, Canada, Mexico, Brazil and India.

Location: Bengaluru | Gurugram | Hybrid
Benefits: 5-Day Working | Career Growth | Flexible Working | Potential On-site Opportunity
Kindly send your CV or resume to careers@prama.ai

Primary skills: GCP, PySpark, Python, SQL, ETL

Job Description:
We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.

Responsibilities:
· Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
· Implement ETL processes to extract, transform, and load data from various sources into BigQuery.
· Optimize data pipelines for performance, cost-efficiency, and reliability.
· Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
· Design and implement data warehouses and data marts using BigQuery.
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.

Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
Flexible schedule
Health insurance
Leave encashment
Paid sick time
Provident Fund
Work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
CTC
Expected CTC
Notice Period (days)
Experience in GCP
Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
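The extract-transform-load responsibilities above follow a standard three-stage shape. A minimal sketch in plain Python with invented field names (in production the transform would run on Dataflow or PySpark and the load stage would be a BigQuery load job, not an in-memory list):

```python
def extract(raw_lines):
    """Extract: parse CSV-like source rows into dicts."""
    return [dict(zip(["id", "amount", "country"], line.split(",")))
            for line in raw_lines]

def transform(rows):
    """Transform: cast types, normalise country codes, drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"].strip().upper()})
        except (ValueError, KeyError):
            continue  # a real pipeline would route these to a dead-letter sink
    return out

def load(rows, table):
    """Load: append to an in-memory 'table' (stand-in for a BigQuery load)."""
    table.extend(rows)
    return len(rows)

table = []
loaded = load(transform(extract(["1,9.5,in", "2,oops,us", "3,4.0, de "])), table)
print(loaded)  # 2  (the malformed row is dropped)
```

The same staging also maps onto ELT: land the raw rows in BigQuery first, then run the transform as SQL, which is often cheaper at scale.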

Posted 22 hours ago

Apply

8.0 years

2 - 8 Lacs

Gurgaon

On-site

Requisition Number: 101352
Architect I - Data
Location: This is a hybrid opportunity in the Delhi-NCR, Bangalore, Hyderabad, Gurugram area.

Insight at a Glance
14,000+ engaged teammates globally with operations in 25 countries across the globe
Received 35+ industry and partner awards in the past year
$9.2 billion in revenue
#20 on Fortune's World's Best Workplaces™ list
#14 on Forbes World's Best Employers in IT – 2023
#23 on Forbes Best Employers for Women in IT – 2023
$1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the role
As an Architect I, you will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. We will count on you to be involved in designing and implementing end-to-end data pipelines using cloud services and data frameworks. Along the way, you will get to:
Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
Develop and build analytics tools that deliver actionable insights to the business.
Integrate and manage large, complex data sets to meet strategic business requirements.
Optimize data processing workflows using frameworks such as PySpark.
Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
Collaborate with cross-functional teams to prioritize deliverables and design solutions.
Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
Drive process improvements for enhanced data delivery speed and reliability.
Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

What we're looking for
8+ years in Business Intelligence (BI) solution design, with 6+ years specializing in ETL processes and data warehouse architecture.
6+ years of hands-on experience with Azure Data services including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric.
Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and building robust data pipelines for metadata, dependency, and workload management.
Familiarity with software development lifecycles/methodologies, particularly Agile.
Experience with SAP/ERP/Datasphere data modeling is a significant plus.
Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
Strong problem-solving, time management, and organizational abilities.
Keen to learn new languages and technologies continually.
Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What you can expect
We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
Freedom to work from another location—even an international destination—for up to 30 consecutive calendar days per year.
Medical Insurance
Health Benefits
Professional Development: Learning Platform and Certificate Reimbursement
Shift Allowance

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities. Join us today, your ambITious journey starts here. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel like your skills are a perfect match, we still want to hear from you! Today's talent leads tomorrow's success.

Learn more about Insight: https://www.linkedin.com/company/insight/

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law.

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Haryana 122002, India

Posted 22 hours ago

Apply

5.0 years

10 - 15 Lacs

Gurgaon

On-site

Job Role: ETL SSIS + SSAS
Job Location: Gurugram
Interview: 1 internal technical round || PM round || Client interview (candidate should be ready for F2F)
Experience Range: 5-8 years

Job Description:
· Microsoft SQL
· Microsoft SSIS, SSAS
· Data warehouse/Data Migration
· Experience in Analytics / OLAP Cube Development (Microsoft SSAS and MDX).
· Analyze, design, build, query, troubleshoot and maintain cubes.
· Knowledge/expertise in SSAS Tabular with the DAX language; proficient in MDX and DAX.
· Strong conceptual knowledge of ETL fundamentals.
· Exposure to the following will be beneficial.

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,500,000.00 per year
Schedule: Day shift
Work Location: In person
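The MDX proficiency asked for above comes down to queries of a standard SELECT shape: a measure set on the columns axis, dimension members on rows, against a cube. A minimal sketch in Python that assembles such a query as a string (the measure, hierarchy, and cube names are illustrative placeholders, not from any particular cube):

```python
def mdx_query(measure, hierarchy, cube):
    """Assemble a basic MDX SELECT: one measure on COLUMNS,
    all members of a hierarchy on ROWS. Doubled braces in the
    f-string emit the literal {...} set delimiters MDX uses."""
    return (f"SELECT {{[Measures].[{measure}]}} ON COLUMNS, "
            f"{{[{hierarchy}].Members}} ON ROWS "
            f"FROM [{cube}]")

q = mdx_query("Sales Amount", "Date", "Adventure Works")
print(q)
# SELECT {[Measures].[Sales Amount]} ON COLUMNS, {[Date].Members} ON ROWS FROM [Adventure Works]
```

Against a Tabular model, the same question would instead be phrased in DAX (e.g. a `SUMMARIZECOLUMNS` query over a measure), which is why the role asks for both languages.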

Posted 22 hours ago

Apply

0 years

5 - 7 Lacs

Gurgaon

On-site

About the role
Support the design, development, and implementation of our People Analytics and data strategy. Our goal is to increase our analytical capabilities and derive actionable insights into our critical business issues, as well as create a scalable data infrastructure and user-friendly reporting environment that can effectively support our growing company.

What you'll do:
Continually look for ways to improve regular delivery of standard analysis/reporting through automation, streamlining, and migration to self-serve platforms.
Support release and QA activities for data pipeline and dashboard enhancements. This includes the development and execution of testing strategies.
Support initiatives to partner with IT to build a well-structured, easy-to-work-with HR data warehouse that contains key business metrics in areas such as quality of hire, productivity, and resourcing.
Support bug/issue resolution processes: root cause analysis, impact analysis and solution design. This includes implementing, testing and deploying solutions.
Execute general administration and reporting tasks pertaining to People Analytics and systems management.
Provide analytical support to projects that improve our performance (e.g. quality of hire): requirements, discussions, problem solving, analytics, sharing insights, building solutions, driving change.
Support broader People Analytics team objectives in delivery of tasks, projects, and enhancements.
Create high-quality analytics/reports and translate them into value-added decisions and actions.
Support business reporting/data needs across Gartner accurately.
What you'll need:
1-3 years' experience in data automation, analytics, and problem solving.
Preferred Bachelor's degree or equivalent in: Computer Science, Computer Engineering, Engineering, Management Science, Data Science.
Interest in pursuing a career in HR analytics, data engineering, data analytics and visualization, business intelligence, or analytical consulting.
Aptitude to use data, analytics, and business knowledge to solve complex business problems.
Hands-on experience with visualization tools (Power BI, etc.); Visier experience a plus.
Preferred experience in the following tools: Microsoft Azure, Azure Data Factory, data pipelining, data transformation, and ETL.

What you will get:
Competitive salary, generous paid time off policy, charity match program, Group Medical Insurance, Parental Leave, Employee Assistance Program (EAP).
Collaborative, team-oriented culture that embraces diversity.
Professional development and unlimited growth opportunities.
#LI-A13

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance.
We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.
Job Requisition ID: 100508

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence.
Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 22 hours ago

Apply

8.0 years

28 - 30 Lacs

Pune

On-site

Experience: 8+ years
Budget: 30 LPA (including variable pay)
Location: Bangalore, Hyderabad, Chennai (Hybrid)
Shift Timing: 2 PM - 11 PM

ETL Development Lead (8+ years)
Experience leading and mentoring a team of Talend ETL developers.
Providing technical direction and guidance on ETL/Data Integration development to the team.
Designing complex data integration solutions using Talend & AWS.
Collaborating with stakeholders to define project scope, timelines, and deliverables.
Contributing to project planning, risk assessment, and mitigation strategies.
Ensuring adherence to project timelines and quality standards.
Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
Perform unit testing and participate in system integration testing of ETL processes.
Monitor and maintain Talend environments, including job scheduling and performance tuning.
Document technical specifications, data flow diagrams, and ETL processes.
Stay up-to-date with the latest Talend features, best practices, and industry trends.
Participate in code reviews and contribute to the establishment of development standards.
Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components.
Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
Strong SQL skills for data querying and manipulation.
Experience with data profiling, data quality checks, and error handling within ETL processes.
Familiarity with job scheduling tools and monitoring frameworks.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and collaboratively within a team environment.
Basic understanding of AWS services, e.g. EC2, S3, EFS, EBS, IAM, AWS Roles, CloudWatch Logs, VPC, Security Groups, Route 53, Network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
Understanding of AWS data integration services, e.g. Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.

Preferred Qualifications:
Experience leading and mentoring a team of 8+ Talend ETL developers.
Experience working with US Healthcare customers.
Bachelor's degree in Computer Science, Information Technology, or a related field.
Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner/Data Engineer Associate.
Experience with AWS Data & Infrastructure Services.
Basic working knowledge of Terraform and GitLab is required.
Experience with scripting languages such as Python or shell scripting.
Experience with agile development methodologies.
Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.

Job Type: Full-time
Pay: ₹2,800,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person
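The data-quality checks and error handling mentioned above usually follow a reject-link pattern: each row either passes every rule or is routed to a reject flow with the failed rule recorded. A minimal sketch in Python with invented field names and rules (in Talend this is typically done with tMap expressions or a tSchemaComplianceCheck component feeding a reject output):

```python
# Invented validation rules: field name -> predicate that must hold.
RULES = {
    "member_id": lambda v: v is not None and str(v).strip() != "",
    "claim_amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    """Split rows into accepted and rejected, recording which rules failed,
    so rejects can be logged or replayed rather than silently dropped."""
    accepted, rejected = [], []
    for row in rows:
        failed = [f for f, rule in RULES.items() if not rule(row.get(f))]
        (rejected if failed else accepted).append((row, failed))
    return [r for r, _ in accepted], rejected

ok, bad = validate([
    {"member_id": "M1", "claim_amount": 120.0},
    {"member_id": "",   "claim_amount": 50.0},
    {"member_id": "M3", "claim_amount": -5},
])
print(len(ok), len(bad))  # 1 2
```

Keeping the failed-rule names alongside each rejected row is what makes downstream error reporting and reprocessing practical.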

Posted 22 hours ago

Apply

2.0 - 5.0 years

15 Lacs

Mumbai

On-site

Job Title: AWS Data Engineer

The role typically involves working various shifts to support customers in a 24/7 roster-based model within an office environment.

Summary: We are seeking an experienced AWS Data Engineer to join our TC - Data and AIoT department. As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure on the AWS platform. You will work closely with cross-functional teams to ensure efficient data flow and integration, enabling effective data analysis and reporting.

Roles and Responsibilities:
1. Design and develop data pipelines and ETL processes on the AWS platform, ensuring scalability, reliability, and performance.
2. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
3. Implement data governance and security measures to ensure compliance with industry standards and regulations.
4. Optimize data storage and retrieval processes to enhance data accessibility and performance.
5. Troubleshoot and resolve data-related issues, ensuring data quality and integrity.
6. Monitor and maintain data pipelines, ensuring timely data ingestion and processing.
7. Stay up to date with the latest AWS services and technologies, and evaluate their potential for enhancing our data infrastructure.
8. Collaborate with DevOps teams to automate deployment and monitoring of data pipelines.
9. Document technical specifications, processes, and procedures related to data engineering.

Qualifications:
1. Bachelor's degree in Computer Science, Engineering, or a related field.
2. 2-5 years of experience in data engineering, with a focus on AWS technologies.
3. Strong knowledge of AWS services such as S3, Glue, Redshift, Athena, EMR, and Lambda.
4. Proficiency in programming languages such as Python, SQL, and Scala.
5. Experience with data modeling, data warehousing, and ETL processes.
6. Familiarity with data governance and security best practices.
7. Strong analytical and problem-solving skills.
8. Excellent communication and collaboration abilities.
9. AWS certifications (e.g., AWS Certified Big Data - Specialty) are a plus.

Job Type: Full-time
Pay: Up to ₹1,500,000.00 per year
Schedule: Rotational shift
Work Location: In person
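The pipeline responsibilities described above (extract, transform, load) can be sketched in plain Python. On AWS this logic would typically live in a Glue or Lambda job reading from S3; the example below uses only the standard library, and the column names and data are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, e.g. a CSV object pulled from S3.
RAW = """order_id,amount,currency
1001,250.00,INR
1002,,INR
1003,99.50,INR
"""

def extract(text):
    """Parse CSV text into dict rows (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast types (the 'transform' step)."""
    clean = []
    for r in rows:
        if r["amount"]:
            clean.append((int(r["order_id"]), float(r["amount"]), r["currency"]))
    return clean

def load(rows, conn):
    """Write into a warehouse-style table (the 'load' step)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 349.5)
```

In a real deployment the same three steps would be distributed: extraction from S3, transformation in Glue or EMR, and loading into Redshift.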

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Job Description

Role Overview
A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.

What Will You Do In This Role
- Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
- Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
- Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
- Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
- Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
- Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
- Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
- Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
- Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
- Analyze data requirements and translate them into technical specifications for ETL processes.
- Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place.
- Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
- Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
- Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.

What Should You Have
- Bachelor's degree in Information Technology, Computer Science, or any technology stream.
- 5+ years of working experience with enterprise data integration technologies: Informatica PowerCenter, Informatica Intelligent Data Management Cloud Services (CDI, CAI, Mass Ingest, Orchestration).
- Integration experience utilizing REST and custom API integration.
- Experience with relational database technologies and cloud data stores from AWS, GCP, and Azure.
- Experience utilizing the AWS Well-Architected Framework, deployment and integration, and data engineering.
- Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory, etc.
- Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency.
- Extensive experience in the design of reusable integration patterns using cloud-native technologies.
- Extensive experience with process orchestration and scheduling integration jobs in AutoSys and Airflow.
- Experience with Agile development methodologies and release management techniques.
- Excellent analytical and problem-solving skills.
- Good understanding of data modeling and data architecture principles.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business, Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R353285
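The incremental-load requirement mentioned above is commonly implemented with a high-water-mark (watermark) pattern: load only rows newer than the last successfully processed timestamp, then advance the watermark. A minimal standard-library sketch with hypothetical table names follows; in practice this logic would sit inside an Informatica mapping or an Airflow task.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source (id INTEGER, updated_at TEXT);
CREATE TABLE target (id INTEGER, updated_at TEXT);
CREATE TABLE watermark (last_ts TEXT);
INSERT INTO watermark VALUES ('2025-01-01T00:00:00');
INSERT INTO source VALUES
  (1, '2024-12-31T23:00:00'),
  (2, '2025-01-02T08:00:00'),
  (3, '2025-01-03T09:00:00');
""")

def incremental_load(conn):
    """Copy only source rows newer than the stored watermark, then advance it."""
    (last_ts,) = conn.execute("SELECT last_ts FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, updated_at FROM source WHERE updated_at > ?", (last_ts,)
    ).fetchall()
    conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
    if rows:
        new_ts = max(r[1] for r in rows)  # highest timestamp seen this run
        conn.execute("UPDATE watermark SET last_ts = ?", (new_ts,))
    conn.commit()
    return len(rows)

print(incremental_load(conn))  # 2 (rows newer than the watermark)
print(incremental_load(conn))  # 0 (nothing new on the next run)
```

The second run returning zero rows is the property that makes the job safely re-runnable on a schedule.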

Posted 22 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We are seeking an experienced IBM ITX Developer with a strong background in IBM Integration Tool (ITX) to join our team. The ideal candidate will have hands-on expertise in developing, implementing, and debugging ITX maps, along with managing EDI transactions in the Logistics & Supply Chain domain.

Primary Skills
- IBM ITX (WebSphere Transformation Extender) Development: Proficiency in designing, developing, and debugging ITX maps.
- Data Formats: Experience with various data formats, including ANSI X12, EDIFACT, XML, JSON, flat files (FF), and CSV.
- Experience or strong knowledge of communication and data handling protocols (AS2, EDI, HTTPS, FTPS/SFTP/SCP/OFTP, POP3/SMTP, Web Services, etc.)

Good To Have
- Logistics Domain Knowledge: Familiarity with logistics processes, supply chain management, and warehouse operations.
- Communication Protocols: Experience with communication protocols like AS2, SFTP, and others relevant to logistics.
- Technical Problem-Solving: Ability to identify, analyse, and resolve technical issues independently.
- Teamwork and Collaboration: Strong interpersonal skills and the ability to work effectively in a team environment.

Nice To Have
- Experience working with IBM complementary products: IBM WebSphere family, IBM PEM Community Manager (PCM), Lightwell B2B Framework.
- Working experience with ETL, Java, Python, or cloud technologies is a strong plus.
- Familiarity with the Supply Chain & Logistics domain (EDI 204, 210, 214, 810, 820, etc.).

Education: Bachelor's degree or equivalent in Computer Science, MIS, or a similar discipline.
Accreditation: Specific business accreditation for Business Intelligence.
Experience: Relevant work experience in data engineering based on the following number of years:
- Associate: Prior experience not required
- Standard I: Two (2) years
- Standard II: Three (3) years
- Senior I: Four (4) years
- Senior II: Five (5) years

Knowledge, Skills and Abilities
- Fluency in English
- Analytical Skills
- Accuracy & Attention to Detail
- Numerical Skills
- Planning & Organizing Skills
- Presentation Skills

Preferred Qualifications:
Pay Transparency:
Pay:
Additional Details:

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle: we return these profits back into the business and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.

Posted 22 hours ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Manager, Quality Engineer

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centres are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Centre helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centres.

What Will You Do In This Role
- Develop and Implement Advanced Automated Testing Frameworks: Architect, design, and maintain sophisticated automated testing frameworks for data pipelines and ETL processes, ensuring robust data quality and reliability.
- Conduct Comprehensive Quality Assurance Testing: Lead the execution of extensive testing strategies, including functional, regression, performance, and security testing, to validate data accuracy and integrity across the bronze layer.
- Monitor and Enhance Data Reliability: Collaborate with the data engineering team to establish and refine monitoring and alerting systems that proactively identify data quality issues and system failures, implementing corrective actions as needed.
- Leverage Generative AI: Innovate and apply generative AI techniques to enhance testing processes, automate complex data validation scenarios, and improve overall data quality assurance workflows.
- Collaborate with Cross-Functional Teams: Serve as a key liaison between Data Engineers, Product Analysts, and other stakeholders to deeply understand data requirements and ensure that testing aligns with strategic business objectives.
- Document and Standardize Testing Processes: Create and maintain comprehensive documentation of testing procedures, results, and best practices, facilitating knowledge sharing and continuous improvement across the organization.
- Drive Continuous Improvement Initiatives: Lead efforts to develop and implement best practices for QA automation and reliability, including conducting code reviews, mentoring junior team members, and optimizing testing processes.

What You Should Have
- Educational Background: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Experience: 4+ years of experience in QA automation, with a strong focus on data quality and reliability testing in complex data engineering environments.
- Technical Skills: Advanced proficiency in programming languages such as Python, Java, or similar for writing and optimizing automated tests. Extensive experience with testing frameworks and tools (e.g., Selenium, JUnit, pytest) and data validation tools, with a focus on scalability and performance. Deep familiarity with data processing frameworks (e.g., Apache Spark) and data storage solutions (e.g., SQL, NoSQL), including performance tuning and optimization. Strong understanding of generative AI concepts and tools, and their application in enhancing data quality and testing methodologies. Proficiency in using Jira Xray for advanced test management, including creating, executing, and tracking complex test cases and defects.
- Analytical Skills: Exceptional analytical and problem-solving skills, with a proven ability to identify, troubleshoot, and resolve intricate data quality issues effectively.
- Communication Skills: Outstanding verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders.

Preferred Qualifications
- Experience with Cloud Platforms: Extensive familiarity with cloud data services (e.g., AWS, Azure, Google Cloud) and their QA tools, including experience in cloud-based testing environments.
- Knowledge of Data Governance: In-depth understanding of data governance principles and practices, including data lineage, metadata management, and compliance requirements.
- Experience with CI/CD Pipelines: Strong knowledge of continuous integration and continuous deployment (CI/CD) practices and tools (e.g., Jenkins, GitLab CI), with experience in automating testing within CI/CD workflows.
- Certifications: Relevant certifications in QA automation or data engineering (e.g., ISTQB, AWS Certified Data Analytics) are highly regarded.
- Agile Methodologies: Proven experience working in Agile/Scrum environments, with a strong understanding of Agile testing practices and principles.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us, and start making your impact today.

#HYDIT2025
Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 08/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R345312
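Automated data-quality checks of the kind this role describes often reduce to small rule functions that a pytest suite runs against each batch before promoting it past the bronze layer. A standard-library sketch follows; the column names, records, and rules are illustrative, not from any real pipeline.

```python
# Minimal data-quality rules of the kind a bronze-layer test suite might run.

def check_not_null(rows, column):
    """Return the indexes of rows where a required column is missing."""
    return [i for i, r in enumerate(rows) if r.get(column) in (None, "")]

def check_unique(rows, column):
    """Return values that appear more than once in a key column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical batch of records to validate.
records = [
    {"patient_id": "P1", "dose_mg": 50},
    {"patient_id": "P2", "dose_mg": None},
    {"patient_id": "P1", "dose_mg": 75},
]

print(check_not_null(records, "dose_mg"))    # [1]
print(check_unique(records, "patient_id"))   # ['P1']
```

In a pytest suite each rule becomes an assertion (e.g. `assert not check_not_null(batch, "dose_mg")`), so a failing batch fails the build rather than silently propagating downstream.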

Posted 22 hours ago

Apply

4.0 - 6.0 years

0 Lacs

Pune

On-site

Job requisition ID: 82154
Date: Jun 15, 2025
Location: Pune
Designation: Consultant
Entity:

Requirements:
- 4-6 years of experience in Cognos BI development (Cognos Analytics, Cognos 11, Framework Manager).
- Strong expertise in Cognos Report Studio, Query Studio, Analysis Studio, and Active Reports.
- Experience with Cognos Framework Manager for building metadata models.
- Proficiency in SQL and relational databases (e.g., Oracle, SQL Server, DB2).
- Knowledge of ETL processes and experience integrating Cognos with data warehouses.
- Familiarity with Cognos Administration (deployment, security, and performance tuning).
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management skills.

Responsibilities:
- Develop, enhance, and maintain Cognos reports, dashboards, and metadata models.
- Work with business stakeholders to gather reporting requirements and translate them into technical solutions.
- Design and optimize complex Cognos Framework Manager models for efficient reporting and performance.
- Implement data security and governance best practices within Cognos reports and dashboards.
- Perform performance tuning and troubleshooting of Cognos reports and queries.
- Collaborate with the database team to optimize SQL queries for report generation.
- Ensure data accuracy and consistency across all reports and dashboards.
- Support Cognos system upgrades, patches, and maintenance activities.
- Document report development processes, configurations, and best practices.

Posted 22 hours ago

Apply

3.0 years

0 Lacs

Mumbai

Remote

Experience: 3 to 4 Years
Location: Mumbai, Maharashtra, India
Openings: 2

Job description:

Key responsibilities:
- Apply design and data analysis techniques to organize the presentation of data in innovative ways; collaborate with research analysts to identify the best means of visually depicting a story.
- Design and develop custom dashboard solutions, as well as reusable data visualization templates.
- Analyze data, identify trends, and discover insights that will guide strategic leadership decisions.
- In daily practice, use JavaScript, Tableau, QlikView, QlikSense, SAS Visual Analytics, Power BI, and dashboard design/development.

Desired Qualifications:
- M.Sc. or PhD in a corresponding field.
- Hands-on experience with programming languages (e.g., Python, Java, Scala) and/or Big Data systems (like Hadoop, Spark, Storm).
- Experience with Linux, Unix shell scripting, NoSQL, and Machine Learning.
- Knowledge of and experience with cloud environments like AWS/Azure/GCP.
- Knowledge of Scrum and Agile.

Required Qualifications:
- Experience with design and development of visual reports and dynamic dashboards on platforms like Tableau, Qlik, Power BI, SAS, or CRM Analytics.
- Experience with SQL, ETL, data warehousing, and BI.
- Knowledge of Big Data.
- Strong verbal and written communication skills in English.

Benefits:
- Competitive salary: 2625-4500 EUR gross
- Flexible vacation + health & travel insurance + relocation
- Work from home, flexible working hours
- Work with Fortune 500 companies from different industries all over the world
- Skills development and training opportunities, company-paid certifications
- Opportunities to advance your career
- An open-minded and inclusive company culture

Role: Visualization Expert
Department: UI/UX
Education: Bachelor's Degree in Computer Science, Statistics, Applied Mathematics, or another related field

Posted 22 hours ago

Apply

0 years

4 - 5 Lacs

Chennai

On-site

Job Description
Solution Architects assess a project's technical feasibility, as well as implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design, and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience.

Job Description - Grade Specific
Managing Solution/Delivery Architect: Design, deliver, and manage complete solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Be seen as a role model in their (local) community.

Certification: Preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF, and/or industry certifications such as TOGAF 9 or equivalent.

Skills (competencies): SDLC Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication

Posted 22 hours ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future.

Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of the proposed solution, including structuring data (data modelling, applying different techniques including 3-NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists, and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured, and unstructured data, as well as structured data stored in traditional databases or file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You
To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or a related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive, and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS, or GCP.
- Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience working with different data integration patterns (batch and streaming) and implementing incremental data loads.
- Proficient in scripting in Java, Windows shell, and PowerShell.
- Proficient in at least one programming language such as Python or Scala.
- Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g., Cosmos DB, MongoDB), Azure Data Factory, Databricks, or similar on AWS/GCP.
- Experience using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of data quality principles and experience implementing them.

Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice Here.

Please note: If you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.

Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager
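The dimensional-modelling work mentioned in the responsibilities (star schemas alongside 3-NF) can be illustrated with a toy fact/dimension pair. The table and column names below are hypothetical, chosen to echo the sensor-data sources the role describes; sqlite3 stands in for the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension: descriptive attributes, one row per asset.
CREATE TABLE dim_asset (
    asset_key  INTEGER PRIMARY KEY,
    asset_name TEXT,
    site       TEXT
);
-- Fact: measurements at the (asset, reading) grain, keyed to the dimension.
CREATE TABLE fact_sensor_reading (
    asset_key     INTEGER REFERENCES dim_asset(asset_key),
    reading_ts    TEXT,
    temperature_c REAL
);
INSERT INTO dim_asset VALUES (1, 'Pump-01', 'Hyderabad'), (2, 'Pump-02', 'Pune');
INSERT INTO fact_sensor_reading VALUES
  (1, '2025-06-16T10:00', 61.5),
  (1, '2025-06-16T11:00', 64.0),
  (2, '2025-06-16T10:00', 58.2);
""")

# Typical star-join consumption query: facts aggregated by a dimension attribute.
for row in conn.execute("""
    SELECT d.site, ROUND(AVG(f.temperature_c), 2)
    FROM fact_sensor_reading f
    JOIN dim_asset d USING (asset_key)
    GROUP BY d.site
    ORDER BY d.site
"""):
    print(row)
# ('Hyderabad', 62.75)
# ('Pune', 58.2)
```

The design choice the bullet alludes to: 3-NF keeps the ingestion layer free of duplication, while the denormalized star form above is what visualization engineers and data scientists actually query.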

Posted 22 hours ago

Apply

8.0 years

28 - 30 Lacs

Chennai

On-site

Experience - 8+ Years Budget - 30 LPA (Including Variable Pay) Location - Bangalore, Hyderabad, Chennai (Hybrid) Shift Timing - 2 PM - 11 PM ETL Development Lead (8+ years) Experience with Leading and mentoring a team of Talend ETL developers. Providing technical direction and guidance on ETL/Data Integration development to the team. Designing complex data integration solutions using Talend & AWS. Collaborating with stakeholders to define project scope, timelines, and deliverables. Contributing to project planning, risk assessment, and mitigation strategies. Ensuring adherence to project timelines and quality standards. Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies. Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components. Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files). Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality. Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes. Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions. Perform unit testing and participate in system integration testing of ETL processes. Monitor and maintain Talend environments, including job scheduling and performance tuning. Document technical specifications, data flow diagrams, and ETL processes. Stay up-to-date with the latest Talend features, best practices, and industry trends. Participate in code reviews and contribute to the establishment of development standards. Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components. 
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
- Strong SQL skills for data querying and manipulation.
- Experience with data profiling, data quality checks, and error handling within ETL processes.
- Familiarity with job scheduling tools and monitoring frameworks.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively within a team environment.
- Basic understanding of AWS services, e.g., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
- Understanding of AWS data integration services, e.g., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.

Preferred Qualifications:
- Experience leading and mentoring a team of 8+ Talend ETL developers.
- Experience working with US healthcare customers.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Talend certifications (e.g., Talend Certified Developer); AWS Certified Cloud Practitioner/Data Engineer Associate.
- Experience with AWS data and infrastructure services.
- Basic working knowledge of Terraform and GitLab is required.
- Experience with scripting languages such as Python or shell scripting.
- Experience with agile development methodologies.
- Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform.

Job Type: Full-time
Pay: ₹2,800,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person
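The posting above calls for data profiling, data quality checks, and error handling within ETL processes. As a minimal illustration of that skill (not Talend itself, which is a graphical tool), the sketch below validates extracted records in plain Python before a load step; the field names and rules are hypothetical.

```python
# Illustrative data-quality check an ETL job might run before loading rows
# into a target table. Field names and validation rules are hypothetical.

REQUIRED_FIELDS = ("order_id", "amount", "currency")

def validate_row(row: dict) -> list:
    """Return a list of data-quality errors for one extracted record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not row.get(field):
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if amount:
        try:
            if float(amount) < 0:
                errors.append("negative amount")
        except ValueError:
            errors.append("non-numeric amount")
    return errors

def split_batch(rows):
    """Route clean rows to the load step and rejects to an error channel."""
    clean, rejects = [], []
    for row in rows:
        errs = validate_row(row)
        (rejects if errs else clean).append((row, errs))
    return clean, rejects
```

In a real pipeline the reject channel would feed the error-handling and monitoring components the posting mentions, rather than an in-memory list.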

Posted 22 hours ago

Apply

3.0 years

4 - 10 Lacs

Chennai

On-site

- 3+ years of Excel or Tableau (data manipulation, macros, charts, and pivot tables) experience
- Experience defining requirements and using data and metrics to draw business insights
- Experience with SQL or ETL
- Knowledge of data visualization tools such as QuickSight, Tableau, Power BI, or other BI packages
- 1+ years of tax, finance, or related analytical field experience

Are you passionate about solving business challenges at a global scale? Amazon Employee Services is looking for an experienced Business Analyst to join the Retail Business Services team and help unlock insights that take our business to the next level. The candidate will be excited about understanding and implementing new, repeatable processes to improve our employees' global work-authorization experience. They will do this by partnering with key stakeholders and digging deep into business challenges to identify the insights that let us define standards and scale this program globally. They will be comfortable delivering and presenting these recommended solutions, retrieving and integrating artifacts in a format that is immediately useful to the business decision-making process. This role requires an individual with excellent analytical abilities as well as outstanding business acumen. The candidate knows and values our customers (internal and external) and will work back from the customer to create structured processes for global expansion of work authorization, helping integrate new countries and new acquisitions into the existing program. They are expert at partnering and earning trust with operations and business leaders to drive these key business decisions.

Responsibilities:
- Own the development and maintenance of new and existing artifacts focused on analysis of requirements, metrics, and reporting dashboards.
- Partner with operations/business teams to consult on, develop, and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs.
- Enable effective decision making by retrieving and aggregating data from multiple sources and compiling it into a digestible and actionable format.
- Prepare and deliver business requirements reviews to the senior management team regarding progress and roadblocks.
- Participate in strategic and tactical planning discussions.
- Design, develop, and maintain scaled, automated, user-friendly systems, reports, and dashboards that support our business needs.
- Excellent writing skills, to create artifacts easily digestible by business and tech partners.

Key job responsibilities
• Design and develop highly available dashboards and metrics using SQL and Excel/Tableau/QuickSight
• Understand the requirements of stakeholders and map them to the data sources/data warehouse
• Own the delivery and backup of periodic metrics and dashboards to the leadership team
• Draw inferences and conclusions, create dashboards and visualizations of processed data, and identify trends and anomalies
• Execute high-priority (i.e., cross-functional, high-impact) projects to improve operations performance with the help of Operations Analytics managers
• Perform business analysis and data queries using appropriate tools
• Work closely with internal stakeholders such as business, engineering, and partner teams, and align them with respect to your focus area

- Experience with Amazon Redshift and other AWS technologies
- Experience creating complex SQL queries joining multiple datasets; ETL and DW concepts
- Experience with Scala and PySpark

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
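The responsibilities above ask for complex SQL joining multiple datasets to feed dashboard metrics. The sketch below shows the shape of such a query: a per-country work-authorization completion rate. The table and column names are hypothetical, and sqlite3 stands in for the Redshift warehouse the posting mentions.

```python
import sqlite3

# Hypothetical mini-warehouse: an employee roster and a work-authorization
# status table, standing in for two warehouse datasets.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (emp_id INTEGER, country TEXT);
    CREATE TABLE work_auth (emp_id INTEGER, status TEXT);
    INSERT INTO employees VALUES (1, 'IN'), (2, 'IN'), (3, 'US');
    INSERT INTO work_auth  VALUES (1, 'verified'), (2, 'pending'), (3, 'verified');
""")

# LEFT JOIN keeps employees with no work-auth record; the CASE expression
# turns statuses into a per-country completion rate for a dashboard tile.
rows = conn.execute("""
    SELECT e.country,
           ROUND(AVG(CASE WHEN w.status = 'verified' THEN 1.0 ELSE 0.0 END), 2)
             AS verified_rate
    FROM employees e
    LEFT JOIN work_auth w ON w.emp_id = e.emp_id
    GROUP BY e.country
    ORDER BY e.country
""").fetchall()
```

The same join-then-aggregate pattern scales to the multi-dataset queries behind QuickSight or Tableau dashboards; only the connection and the table names change.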
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 22 hours ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Chennai

On-site

Position: Database Admin - Redshift

Purpose of the Position: You will be a critical member of the Infocepts Cloud Data Administrator team. This position requires a deep understanding of Amazon Redshift, database performance tuning, and optimization techniques. A strong foundation in database concepts and SQL, plus experience with AWS services, is essential.

Location: Nagpur/Pune/Bangalore/Chennai
Type of Employment: Full-time

Key Result Areas and Activities:
- Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
- Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
- Backup and Recovery: Develop and maintain database backup and recovery strategies.
- Security Enforcement: Implement and enforce database security policies and procedures.
- Cost-Performance Balance: Ensure an optimal balance between cost and performance.
- Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries; perform database migrations, upgrades, and patching.
- Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams; automate routine database tasks using scripting languages and tools.
- Continuous Learning: Stay updated with the latest Redshift features, best practices, and industry trends; deliver technology-focused training sessions and share expert knowledge with client stakeholders as needed.
- Documentation and Proposals: Assist in designing case-study documents and collaborate with Centre of Excellence/Practice teams on proposals.
- Mentorship and Recruitment: Mentor and groom junior DBAs and participate in conducting interviews for the organization.
- Value-Added Improvements: Propose improvements to the existing database landscape.
- Product Team Collaboration: Collaborate effectively with product teams to ensure seamless integration and performance.
Essential Skills:
- Strong understanding of database design, performance tuning, and optimization techniques
- Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
- Experience with database backup and recovery, security, and high-availability solutions
- Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
- Operating systems: any flavor of Linux, Windows

Desirable Skills:
- Knowledge of other database systems (e.g., Snowflake, SingleStore, PostgreSQL, MySQL)
- AWS Certified Database - Specialty or other relevant certifications
- Prior experience working in a large media company would be an added advantage

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
- 7-10 years of experience as a Database Administrator, with at least 5 years specifically with Amazon Redshift
- Demonstrated continued learning through one or more technical certifications or related methods
- Experience with data warehousing concepts and ETL processes

Qualities:
- A quick self-learner, ready to adapt to new technologies as required
- Able to deep-dive and research various technical fields
- Able to communicate persuasively through speaking, writing, and client presentations
- Able to consult, write, and present persuasively
- Able to work in a self-organized, cross-functional team
- Able to iterate based on new information, peer reviews, and feedback

Location: India
Years of Experience: 7 to 10 years
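Among the key result areas above is automating routine database tasks with scripting languages. A hedged sketch of that idea: deciding which tables need maintenance from catalog statistics. In a real deployment the numbers would come from Redshift's SVV_TABLE_INFO system view; here they are supplied as plain dicts, and the thresholds are illustrative assumptions, not Redshift defaults.

```python
# Illustrative nightly-maintenance planner for a Redshift-style warehouse.
# Thresholds are assumptions for the example, not recommended values.
UNSORTED_PCT_THRESHOLD = 20.0   # % unsorted rows before a VACUUM is queued
STALE_STATS_THRESHOLD = 10.0    # % stats staleness before an ANALYZE is queued

def maintenance_commands(table_stats):
    """Return the SQL maintenance commands a nightly automation job would run.

    Each entry in table_stats mimics a row from SVV_TABLE_INFO:
    {"table": name, "unsorted_pct": float, "stats_off_pct": float}.
    """
    commands = []
    for t in table_stats:
        if t["unsorted_pct"] > UNSORTED_PCT_THRESHOLD:
            commands.append('VACUUM SORT ONLY {};'.format(t["table"]))
        if t["stats_off_pct"] > STALE_STATS_THRESHOLD:
            commands.append('ANALYZE {};'.format(t["table"]))
    return commands
```

A production version would execute the generated commands through a database driver and log the outcome to CloudWatch, tying together the scripting, monitoring, and performance-tuning duties the posting lists.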

Posted 22 hours ago

Apply