
1515 Talend Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
Joining the IBM Technology Expert Labs team means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success, both in your career and in solving clients’ business challenges, this role is for you. A ‘day in the life’ of this opportunity may include, but is not limited to:

Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations.
Technical Solution Workshops: Conducting and participating in technical solution workshops.
Building Effective Relationships: Developing successful relationships at all levels—from engineers to CxOs—with experience navigating challenging debate to reach healthy resolutions.
Self-Motivated Problem Solver: Demonstrating a natural bias toward self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
In-depth knowledge of the IBM Data & AI portfolio.
15+ years of experience in software services.
10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms.
Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines.
10+ years of experience with ETL and database technologies.
Experience in architectural planning and implementation for the upgrade/migration of these specific products.
Experience in designing and implementing Data Quality solutions.
Experience with installation and administration of these products.
Excellent understanding of cloud concepts and infrastructure.
Excellent verbal and written communication skills.

Preferred Technical And Professional Experience
Experience with any of the DataStage, Informatica, SAS, or Talend products.
Experience with any of IKC, IGC, or Axon.
Experience with programming languages like Java/Python.
Experience with AWS, Azure, Google, or IBM cloud platforms.
Experience with Red Hat OpenShift.
Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.

Posted 3 days ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Title: Data Engineer (4+ Years Experience)
Location: Pan India
Job Type: Full-Time
Experience: 4+ Years
Notice Period: Immediate to 30 days preferred

Job Summary
We are looking for a skilled and motivated Data Engineer with more than 4 years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization.

Key Responsibilities
Design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python.
Implement and manage data models and workflows in AWS Redshift.
Work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions.
Perform data validation, cleansing, and transformation to ensure high data quality.
Build and maintain automation scripts and jobs using Lambda and Glue (if applicable).
Ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3).
Participate in data architecture and platform design discussions.
Monitor pipeline performance, troubleshoot issues, and ensure data reliability.
Document data workflows, processes, and infrastructure components.

Required Skills
4+ years of hands-on experience as a Data Engineer.
Strong proficiency in AWS Redshift, including schema design, performance tuning, and SQL development.
Expertise in Python and PySpark for data manipulation and pipeline development.
Experience working with structured and semi-structured data (JSON, Parquet, etc.).
Deep knowledge of data warehouse design principles, including star/snowflake schemas and dimensional modeling.

Good To Have
Working knowledge of AWS Glue and building serverless ETL pipelines.
Experience with AWS Lambda for lightweight processing and orchestration.
Exposure to ETL tools like Informatica, Talend, or Apache NiFi.
Familiarity with workflow orchestrators (e.g., Airflow, Step Functions).
Knowledge of DevOps practices, version control (Git), and CI/CD pipelines.

Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
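
For illustration, here is a minimal PySpark sketch of the kind of S3 ingestion-and-transform pipeline this posting describes; the bucket paths and column names are hypothetical placeholders, not details from the listing, and the job assumes the S3A connector and credentials are configured.

```python
# Minimal PySpark ELT sketch: read semi-structured JSON from S3,
# clean and transform it, and write partitioned Parquet to a data lake.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/orders/2024/")

cleaned = (
    raw.dropDuplicates(["order_id"])                      # basic deduplication
       .filter(F.col("order_total").isNotNull())          # drop incomplete rows
       .withColumn("order_date", F.to_date("order_ts"))   # normalize timestamps
       .withColumn("order_total", F.col("order_total").cast("decimal(12,2)"))
)

# Partitioned Parquet is a common layout for Redshift Spectrum / Glue catalogs.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-curated-bucket/orders/"))
```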

Posted 3 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
Develop automated test scripts using Python or R for data validation and reconciliation.
Perform source-to-target data verification, transformation logic testing, and regression testing.
Collaborate with data engineers and analysts to understand business requirements and data flows.
Identify data anomalies and work with development teams to resolve issues.
Maintain test documentation, including test cases, test results, and defect logs.
Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications
Strong experience in ETL testing across various data sources and targets.
Proficiency in Python or R for scripting and automation.
Solid understanding of SQL and relational databases.
Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
Experience with test management tools (e.g., JIRA, TestRail).
Knowledge of data profiling, data quality frameworks, and validation techniques.
Excellent analytical and communication skills.
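
As a flavor of the automated validation work described above, here is a minimal Python sketch of a source-to-target reconciliation check; the connection strings, tables, and columns are hypothetical, and a real suite would also cover transformation logic and regression cases.

```python
# Minimal source-to-target reconciliation sketch: compare row counts and
# a column checksum between an ETL source and its warehouse target.
# Connection strings, tables, and columns are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-host/sales")
target = create_engine("postgresql://user:pass@warehouse-host/dw")

def profile(engine, table: str) -> pd.Series:
    """Return simple reconciliation metrics for one table."""
    df = pd.read_sql(f"SELECT order_id, amount FROM {table}", engine)
    return pd.Series({
        "row_count": len(df),
        "distinct_keys": df["order_id"].nunique(),   # detects duplicate loads
        "amount_sum": round(df["amount"].sum(), 2),  # coarse value checksum
    })

src, tgt = profile(source, "orders"), profile(target, "stg_orders")
mismatches = src.compare(tgt)
print("PASS" if mismatches.empty else f"FAIL:\n{mismatches}")
```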

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

You will drive growth by identifying, developing, and closing new business opportunities while building relationships with potential clients. Understanding their needs will enable you to effectively showcase how the company's IT solutions can add value. As a Business Development Analyst, you will be instrumental in driving the growth and success of the company through strategic analysis and development initiatives.

Your responsibilities will include generating new business leads and establishing strong relationships with clients. You will conduct market research and analysis to identify business opportunities, utilizing data analytics tools such as Tableau and SQL to extract insights and trends. Developing financial models and forecasts to support strategic decision-making will be a key aspect of your role. Collaborating with cross-functional teams to implement business development strategies, creating business proposals and presentations, and monitoring industry trends and competitor activities to identify potential risks and opportunities will also be part of your duties.

The ideal candidate will have proficiency in project management methodologies and strong analytical skills, with experience in data analysis using tools such as VBA, Python, and ETL platforms like Talend. Familiarity with business intelligence tools like Tableau for data visualization, and the ability to conduct market research and analyze data to drive business growth, are essential. Knowledge of SQL for database querying and manipulation, along with experience in business analytics, forecasting, and trend analysis, is also desired. Familiarity with monitoring industry trends for strategic insights is a plus.

This is a full-time, permanent position with an in-person work location.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You are a Data Architect / Data Engineer with expertise in Qlik Sense Cloud, responsible for designing the organization's data infrastructure, building and maintaining data pipelines, and managing ETL processes for seamless data integration. Your role involves integrating business intelligence tools, supporting data visualization and reporting requirements, and ensuring high-quality outputs from the data analyst teams.

Your key responsibilities include designing and implementing data architectures, developing scalable data pipelines and ETL workflows, ensuring data security and governance, managing reporting standards, integrating Qlik Sense Cloud, managing data models for business dashboards, collaborating with stakeholders, optimizing data systems, and maintaining documentation of data architectures and best practices.

To excel in this role, you should have deep knowledge of database technologies and big data platforms, experience with ETL tools and cloud data platforms, expertise in Qlik Sense Cloud, proficiency in programming languages, strong collaboration and leadership skills, and the ability to troubleshoot ETL issues and optimize data pipeline performance. If you are passionate about data architecture, data modeling, and delivering actionable insights through scalable reporting solutions, this role at Clinisupplies in Indore is the perfect opportunity for you.

Posted 3 days ago

Apply

0.0 - 6.0 years

15 - 18 Lacs

Indore, Madhya Pradesh

On-site

Location: Indore
Experience: 6+ Years
Work Type: Hybrid
Notice Period: 0-30 days (immediate joiners preferred)

We are hiring for a digital transformation consulting firm that specializes in advisory and implementation of AI, automation, and analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP.

Job Description:
We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive the data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms.

Key Responsibilities:
Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions.
Define best practices for data architecture, modeling, and governance.
Oversee data integration, transformation, and migration strategies.
Ensure high availability, performance tuning, and optimization of databases and ETL pipelines.
Implement data security, compliance, and backup strategies.

Required Skills & Qualifications:
6+ years of experience in database and data engineering roles.
Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS).
Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery).
Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica).
Experience with cloud data platforms (AWS, Azure, GCP).
Proficiency in programming/scripting languages (Python, SQL, shell scripting).
Strong problem-solving, leadership, and communication skills.

Preferred Skills (Good to Have):
Experience with big data technologies (Hadoop, Spark, Kafka).
Knowledge of real-time data processing.
Exposure to AI/ML technologies and working with ML algorithms.

Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹1,800,000.00 per year
Schedule: Day shift

Application Question(s):
We must fill this position urgently. Can you start immediately?
Have you held a lead role in the past?

Experience:
Extract, Transform, Load (ETL): 6 years (Required)
Python: 5 years (Required)
Big data technologies (Hadoop, Spark, Kafka): 6 years (Required)
Snowflake: 6 years (Required)
Data warehouse: 6 years (Required)

Location: Indore, Madhya Pradesh (Required)
Work Location: In person

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

The role of Lead, Software Engineer at Mastercard involves playing a crucial part in the data unification process across different data assets to create a unified view of data from multiple sources. This position will focus on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with various teams such as Product Management, Data Science, Platform Strategy, and Technology to understand data needs and requirements for delivering data solutions that bring business value.

Key responsibilities include performing data ingestion, aggregation, and processing to derive relevant insights; manipulating and analyzing complex data from various sources; identifying innovative ideas, delivering proofs of concept and prototypes, and proposing new products and enhancements. Moreover, integrating and unifying new data assets to enhance customer value, analyzing transaction and product data to generate actionable recommendations for business growth, and collecting feedback from clients, development, product, and sales teams for new solutions are also part of the role.

The ideal candidate for this position should have a good understanding of streaming technologies like Kafka and Spark Streaming; proficiency in programming languages such as Java, Scala, or Python; experience with an enterprise business intelligence platform/data platform; strong SQL and higher-level programming skills; knowledge of data mining and machine learning algorithms; and familiarity with data integration tools like ETL/ELT tools, including Apache NiFi, Azure Data Factory, Pentaho, and Talend. Additionally, they should possess the ability to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization.

It is essential for all employees working at or on behalf of Mastercard to adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines.

The Lead, Software Engineer role at Mastercard offers an exciting opportunity to contribute to the development of innovative data-driven solutions that drive business growth and enhance the customer value proposition.
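
To illustrate the streaming skills the posting calls out, here is a minimal Spark Structured Streaming sketch that consumes Kafka events and aggregates them; the broker address, topic, and schema are hypothetical, and the job assumes the Spark Kafka connector package is on the classpath.

```python
# Minimal Spark Structured Streaming sketch: consume JSON transaction
# events from Kafka and compute windowed per-merchant spend.
# Broker address, topic, and schema are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("txn-stream").getOrCreate()

schema = (StructType()
          .add("txn_id", StringType())
          .add("merchant", StringType())
          .add("amount", DoubleType())
          .add("event_time", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "transactions")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Windowed spend per merchant; the watermark bounds state for late events.
per_merchant = (events
                .withWatermark("event_time", "10 minutes")
                .groupBy(F.window("event_time", "5 minutes"), "merchant")
                .agg(F.sum("amount").alias("total_spend")))

query = (per_merchant.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```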

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Integration Solutions Developer, you will be responsible for designing, developing, and implementing integration solutions using various technologies and platforms such as API gateways, ESBs, iPaaS, and message queues. You will analyze business requirements and translate them into technical integration designs, as well as develop and maintain APIs and data transformation processes. Your role will involve configuring and managing integration platforms and tools, monitoring integration flows, troubleshooting issues, and implementing solutions.

Collaboration is key in this role, as you will work closely with software developers, system administrators, and business stakeholders to understand integration needs and deliver effective solutions. Ensuring the security, reliability, and scalability of integration solutions will be a priority, along with documenting integration designs, specifications, and processes. Staying updated on the latest integration technologies and best practices is essential, as is participating in code reviews and providing technical guidance.

To be successful in this position, you should have 3-5 years of experience in software development or integration, with proven experience in designing and implementing system integrations. Proficiency in one or more programming languages such as Java, Python, C#, or Node.js is required, along with a strong understanding of API concepts (REST, SOAP) and experience working with APIs. Experience with integration patterns and principles, data formats like XML and JSON, databases, and querying languages (e.g., SQL) is necessary. Troubleshooting and debugging skills in complex integrated environments, as well as excellent communication and collaboration skills, are also important.

Preferred qualifications include experience with specific integration platforms or tools like MuleSoft, Dell Boomi, Apache Camel, Talend, Informatica, Azure Integration Services, or AWS integration services. Familiarity with message queuing systems (e.g., Kafka, RabbitMQ, ActiveMQ), cloud computing platforms (e.g., AWS, Azure, GCP), containerization (e.g., Docker, Kubernetes), security best practices in integration, CI/CD pipelines for integration deployments, and industry standards or protocols relevant to the business is a plus.

Ideally, you should possess a Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Stay updated on the technology landscape and continuously enhance your skills to deliver effective integration solutions.
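
As a small illustration of the API-based integration work described, here is a hedged Python sketch that pulls records from one REST endpoint and pushes transformed payloads to another with retry handling; all endpoints, fields, and tokens are hypothetical placeholders, not details from the listing.

```python
# Minimal REST integration sketch: pull records from a source API,
# map them onto a target contract, and push them with retries.
# Endpoints, fields, and tokens are hypothetical placeholders.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=3, backoff_factor=1, status_forcelist=[429, 500, 502, 503])
session.mount("https://", HTTPAdapter(max_retries=retry))

def sync_orders() -> None:
    src = session.get(
        "https://source.example.com/api/orders",
        headers={"Authorization": "Bearer SRC_TOKEN"},
        timeout=30,
    )
    src.raise_for_status()

    for order in src.json()["items"]:
        # Map the source schema onto the target's expected contract.
        payload = {
            "externalId": order["id"],
            "totalAmount": order["total"],
            "currency": order.get("currency", "USD"),
        }
        resp = session.post(
            "https://target.example.com/api/v1/orders",
            json=payload,
            headers={"Authorization": "Bearer TGT_TOKEN"},
            timeout=30,
        )
        resp.raise_for_status()

if __name__ == "__main__":
    sync_orders()
```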

Posted 3 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

YASH Technologies is a leading technology integrator that specializes in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. We pride ourselves on being a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in bringing real positive changes in an increasingly virtual world, driving us beyond generational gaps and future disruptions.

We are currently seeking SAP MDM Professionals with the following qualifications:
- Experience required: 10+ years
- Mix of directed and self-directed work
- Techno-functional experience
- Experience in SAP ECC and S/4 (cutovers, ongoing support, governance)
- Experience in multiple SAP modules (MM, PP, FI, WH/EWM, QM) and a strong understanding of SAP table structures and their purpose
- End-to-end data movement in SAP modules, focusing on MM, PP, FI, WH/EWM, QM
- Experience in manufacturing
- Experience in kitting would be nice but is not required
- Working data experience with BOM, routing, inspection plans, work centers, inventory, production versions, PIR/SL
- Experience determining relevancy/extraction rules before/for field mapping
- Experience with interfaces is a plus
- Lead data mapping workshops and conduct follow-up meetings with business and IT teams
- Coordinate with the business on specifying data cleansing rules to standardize existing source system data
- Ownership of tasks in field mapping, build-out coordination with developer contacts, and data issue resolution with business/dev teams
- Exposure to common industry ETL tools, plus analysis experience for data quality and exception review to support ETL development and execution
- Work with Talend developers to develop load-ready files (LRF)
- Experience working with functional specs; responsibility for loading data and troubleshooting issues
- Coordinate and drive post-load validations with the business for data verification
- Functional unit test planning and execution
- Strong communication and people skills

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

We are seeking a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions, and you will document data workflows, technical designs, and operational procedures.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as analytical thinking, problem-solving, and excellent communication are essential for this role.

Location: Bhilai, Indore
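
For illustration, here is a minimal sketch of the COPY INTO-based Snowflake ingestion this role centers on, using the snowflake-connector-python package; the account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal Snowflake batch-ingestion sketch using the Python connector:
# load staged S3 files with COPY INTO, then verify the row count.
# Account, credentials, stage, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO is idempotent per file: already-loaded files are skipped,
    # which makes reruns after partial failures safe.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.S3_ORDERS_STAGE/2024/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    for row in cur.fetchall():
        print(row)  # per-file load status returned by COPY INTO
    cur.execute("SELECT COUNT(*) FROM RAW.ORDERS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```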

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As an experienced Data Migration and Integration engineer, you will be part of the STS group in capital markets, focusing on migrating client data from legacy/third-party systems into FIS products. You will work alongside specialists in Adeptia, PL/SQL, and SSIS, with solid domain knowledge of Lending, Risk, and Treasury products.

In this role, you will be responsible for completing Data Migration/Conversion/Integration projects within the specified time frame and to the highest quality. Your duties will include clear and timely communication, problem escalation, and resolution efforts to ensure project success.

To excel in this position, you should hold a Bachelor's degree in Computer Science or a related field (B.Sc./B.C.A./B.Tech./B.E./M.C.A.) with a minimum of 8-10 years of overall experience, primarily in ETL. Your expertise should include 5-6 years of experience with ETL tools like SSIS, Talend, and Adeptia, as well as proficiency in PL/SQL or T-SQL programming for Oracle/SQL Server databases. Additionally, you should possess strong knowledge of RDBMS concepts, OLTP system architecture, and analytical programs like Power BI and Crystal Reports/SSRS. Experience with source code control mechanisms (Git/Bitbucket), XML and JSON structures, Jenkins, job scheduling, SOAP, and REST, along with problem-solving skills, is also essential for this role. Strong written and verbal communication, interpersonal skills, and the ability to work independently in high-pressure situations are key attributes. Previous experience in the banking or financial industry is preferred, along with mentorship skills and hands-on experience in languages like Python, Java, or C#.

At FIS, you will have the opportunity to learn, grow, and make a significant impact on your career. We offer extensive health benefits, career mobility options, award-winning learning programs, a flexible home-office work model, and the chance to collaborate with global teams and clients. FIS is dedicated to safeguarding the privacy and security of personal information processed for client services. Our recruitment model primarily involves direct sourcing, and we do not accept resumes from agencies not on our preferred supplier list. If you are ready to advance the world of fintech and meet the criteria outlined above, we invite you to join us at FIS.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Associate Technical Product Analyst - Global Data & Analytics Platform at McDonald's Corporation in Hyderabad, you will be an integral part of the Global Technology Enterprise Products & Platforms (EPP) Team. In this role, you will focus on data management and operations within the Global Data & Analytics Platform (GDAP) to support integrations with core corporate accounting/financial/reporting applications. Your vision will align with McDonald's goal to be a people-led, product-centric, forward-thinking, and trusted technology partner.

Your responsibilities will include supporting the Technical Product Management leadership in technical/IT-related delivery topics such as trade-offs in implementation approaches and tech stack selection. You will provide technical guidance for developers/squad members, manage the output of internal/external squads to ensure adherence to McDonald's standards, participate in roadmap and backlog preparation, and maintain technical process flows and solution architecture diagrams at the product level. Additionally, you will lead acceptance criteria creation, validate development work, support hiring and development of engineers, and act as a technical developer as needed.

To excel in this role, you should possess a Bachelor's degree in computer science or engineering, along with at least 3 years of hands-on experience designing and implementing solutions using AWS Redshift and Talend. Experience in data warehousing is a plus, as is familiarity with accounting and financial solutions across different industries. Knowledge of Agile software development processes, collaborative problem-solving skills, and excellent communication abilities are essential for success in this position.

Preferred qualifications include proficiency in SQL, data integration tools, and scripting languages, as well as a strong understanding of Talend, AWS Redshift, and other AWS services. Experience with RESTful APIs, microservices architecture, DevOps practices, and tools like Jenkins and GitHub is highly desirable. Additionally, foundational expertise in security standards, cloud architecture, and Oracle cloud security will be advantageous.

This full-time role based in Hyderabad, India, offers a hybrid work mode. If you are a detail-oriented individual with a passion for leveraging technology to drive business outcomes and are eager to contribute to a global team dedicated to innovation and excellence, we invite you to apply for the position of Associate Technical Product Analyst at McDonald's Corporation.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

The Sr. Analytics Consultant is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as UKG Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Sr. Analytics Consultant will also be responsible for developing custom analytics solutions and reports to provided specifications and supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Responsibilities Include
Interact with other business and technical project stakeholders to gather business requirements.
Deploy and configure the UKG Analytics and Data Hub products based on the design documents.
Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT, or Power BI.
Put together a test plan, validate the solution deployed, and document the results.
Provide support during production cutover; after go-live, act as the first level of support for any requests that come through from the customer or other consultants.
Analyze the customer’s data to spot trends and issues and present the results back to the customer.

Qualification
5+ years’ experience designing and delivering analytical/business intelligence solutions required.
Cognos, BIRT, Power BI, or other business intelligence toolset experience required.
ETL experience using Talend or other industry-standard ETL tools strongly preferred.
Advanced SQL proficiency is a plus.
Knowledge of Google Cloud Platform, Azure, or something similar is desired, but not required.
Knowledge of Python is desired, but not required.
Willingness to learn new technologies and adapt quickly is required.
Strong interpersonal and problem-solving skills.
Flexibility to support customers in different time zones is required.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation in the Application and Interview Process
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Description
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: Senior Software Engineer - Data Analyst
Position: Senior Software Engineer - Data Analyst
Experience: 3 to 6 Years
Main location: India, Telangana, Hyderabad
Position ID: J0525-1616
Shift: General Shift (5 days WFO for initial 8 weeks)
Employment Type: Full Time

Your future duties and responsibilities
Design, develop, and optimize complex SQL queries for data extraction, transformation, and loading (ETL).
Work with Teradata databases to perform high-volume data analysis and support enterprise-level reporting needs.
Understand business and technical requirements to create and manage Source to Target Mapping (STM) documentation.
Collaborate with business analysts and domain SMEs to map banking-specific data such as transactions, accounts, customers, products, and regulatory data.
Analyze large data sets to identify trends, data quality issues, and actionable insights.
Participate in data migration, data lineage, and reconciliation processes.
Ensure data governance, quality, and security protocols are followed.
Support testing and validation efforts during system upgrades or new feature implementations.

Required qualifications to be successful in this role
Advanced SQL - joins, subqueries, window functions, performance tuning.
Teradata - query optimization, utilities (e.g., BTEQ, FastLoad, MultiLoad), DDL/DML.
Experience with ETL tools (e.g., Informatica, Talend, or custom SQL-based ETL pipelines).
Hands-on experience preparing STM (Source to Target Mapping) documents.
Familiarity with data modeling and data warehouse concepts (star/snowflake schema).
Proficient in Excel and/or BI tools (Power BI, Tableau, etc.) for data visualization and analysis.

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team—one of the largest IT and business consulting services firms in the world.
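
As a flavor of the advanced SQL this role calls for, here is a hedged sketch using the teradatasql driver to run a windowed query with Teradata's QUALIFY clause; the host, credentials, and table/column names are hypothetical placeholders.

```python
# Minimal windowed-SQL sketch against Teradata: rank each customer's
# transactions by amount and compute a per-customer total.
# Host, credentials, and table/column names are hypothetical.
import teradatasql

QUERY = """
SELECT  customer_id,
        txn_id,
        txn_amount,
        ROW_NUMBER() OVER (PARTITION BY customer_id
                           ORDER BY txn_amount DESC)     AS amount_rank,
        SUM(txn_amount) OVER (PARTITION BY customer_id)  AS customer_total
FROM    bank_dw.transactions
QUALIFY amount_rank <= 3  -- Teradata-specific filter on window results
"""

with teradatasql.connect(host="tdprod.example.com",
                         user="analyst",
                         password="***") as con:
    cur = con.cursor()
    cur.execute(QUERY)
    for customer_id, txn_id, amount, rank, total in cur.fetchall():
        print(customer_id, txn_id, amount, rank, total)
```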

Posted 4 days ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Noida, Pune, Bengaluru

Work from Office

Position Summary
We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency.

Job Responsibilities
Technology Leadership - Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments.
Solution Architecture & Review - Expertise in conceptualizing solution architecture and low-level design in a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies.
Managing projects in a fast-paced agile ecosystem and ensuring quality deliverables within stringent timelines.
Responsible for risk management, maintaining the risk documentation and mitigation plans.
Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments.
Communication & Logical Thinking - Demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders.
Handle Client Relationship - Manage client relationships and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.

Education
BE/B.Tech, Master of Computer Application

Work Experience
Should have expertise and 8+ years of working experience in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend.
Should have expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle.
Should have strong data warehousing, data integration, and data modeling fundamentals like star schema, snowflake schema, dimension tables, and fact tables.
Strong experience with SQL building blocks, creating complex SQL queries and procedures.
Experience in AWS or Azure cloud and its service offerings.
Aware of techniques such as data modeling, performance tuning, and regression testing.
Willingness to learn and take ownership of tasks.
Excellent written/verbal communication and problem-solving skills. Understanding of and working experience with pharma commercial data sets like IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage.
Hands-on experience with scrum methodology (sprint planning, execution, and retrospection).

Behavioural Competencies
Teamwork & Leadership; Motivation to Learn and Grow; Ownership; Cultural Fit; Talent Management

Technical Competencies
Problem Solving; Lifescience Knowledge; Communication; Designing Technical Architecture; Agile; PySpark; AWS Data Pipeline; Data Modelling; Matillion; Databricks

Location - Noida, Bengaluru, Pune, Hyderabad, India

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Data Engineer

Job Summary
We are seeking an experienced Data Engineer with 5-8 years of professional experience to design, build, and optimize robust and scalable data pipelines for our SmartFM platform. The ideal candidate will be instrumental in ingesting, transforming, and managing vast amounts of operational data from various building devices, ensuring high data quality and availability for analytics and AI/ML applications. This role is critical in enabling our platform to generate actionable insights, alerts, and recommendations for optimizing facility operations.

Roles And Responsibilities
Design, develop, and maintain scalable and efficient data ingestion pipelines from diverse sources (e.g., IoT devices, sensors, existing systems) using technologies like IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Kafka.
Implement robust data transformation and processing logic to clean, enrich, and structure raw data into formats suitable for analysis and machine learning models.
Manage and optimize data storage solutions, primarily within MongoDB, ensuring efficient schema design, data indexing, and query performance for large datasets.
Collaborate closely with Data Scientists to understand their data needs, provide high-quality, reliable datasets, and assist in deploying data-driven solutions.
Ensure data quality, consistency, and integrity across all data pipelines and storage systems, implementing monitoring and alerting mechanisms for data anomalies.
Work with cross-functional teams (Software Engineers, Data Scientists, Product Managers) to integrate data solutions with the React frontend and Node.js backend applications.
Contribute to the continuous improvement of data architecture, tooling, and best practices, advocating for scalable and maintainable data solutions.
Troubleshoot and resolve complex data-related issues, optimizing pipeline performance and ensuring data availability.
Stay updated with emerging data engineering technologies and trends, evaluating and recommending new tools and approaches to enhance our data capabilities.

Required Technical Skills And Experience
5-8 years of professional experience in Data Engineering or a related field.
Proven hands-on experience with data pipeline tools such as IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Apache Kafka.
Strong expertise in database management, particularly with MongoDB, including schema design, data ingestion pipelines, and data aggregation.
Proficiency in at least one programming language commonly used in data engineering, such as Python or Java/Scala.
Experience with big data technologies and distributed processing frameworks (e.g., Apache Spark, Hadoop) is highly desirable.
Familiarity with cloud platforms (Azure, AWS, or GCP) and their data services.
Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
Experience with DevOps practices for data pipelines (CI/CD, monitoring, logging).
Knowledge of Node.js and React environments to facilitate seamless integration with existing applications.

Additional Qualifications
Demonstrated expertise in written and verbal communication, adept at simplifying complex technical concepts for both technical and non-technical audiences.
Strong problem-solving and analytical skills with a meticulous approach to data quality.
Experienced in collaborating and communicating seamlessly with diverse technology roles, including development, support, and product management.
Highly motivated to acquire new skills, explore emerging technologies, and stay updated on the latest trends in data engineering and business needs.
Experience in the facility management domain or with IoT data is a plus.

Education Requirements / Experience
Bachelor's (BE/BTech) or Master's degree (MS/MTech) in Computer Science, Information Systems, Mathematics, Statistics, or a related quantitative field.
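
To illustrate the MongoDB ingestion work described above, here is a minimal pymongo sketch that upserts cleaned device readings idempotently; the URI, database, and field names are hypothetical placeholders, not details from the listing.

```python
# Minimal MongoDB ingestion sketch with pymongo: upsert cleaned device
# readings and keep an index that matches the common access pattern.
# URI, database, and field names are hypothetical placeholders.
from datetime import datetime, timezone
from pymongo import MongoClient, ASCENDING, UpdateOne

client = MongoClient("mongodb://localhost:27017")
readings = client["smartfm"]["device_readings"]

# Compound unique index supports "device over time" queries and dedupes.
readings.create_index([("device_id", ASCENDING), ("ts", ASCENDING)],
                      unique=True)

def ingest(batch: list) -> None:
    """Validate raw sensor payloads and upsert them idempotently."""
    ops = []
    for msg in batch:
        if "device_id" not in msg or "value" not in msg:
            continue  # a real pipeline would dead-letter malformed payloads
        doc = {
            "device_id": msg["device_id"],
            "ts": msg.get("ts", datetime.now(timezone.utc)),
            "value": float(msg["value"]),
        }
        ops.append(UpdateOne({"device_id": doc["device_id"], "ts": doc["ts"]},
                             {"$set": doc}, upsert=True))
    if ops:
        result = readings.bulk_write(ops, ordered=False)
        print("upserted:", result.upserted_count,
              "modified:", result.modified_count)

ingest([{"device_id": "ahu-7", "value": "21.5"}])
```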

Posted 4 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Seeking a Data Governance and Management professional with 7+ years of experience in implementing enterprise governance frameworks. Proficient in Microsoft Purview and Informatica with strong expertise in metadata management, lineage mapping, classification, and security. Skilled in ADF, REST APIs, Talend, dbt, and Azure-based automation, with a solid understanding of GDPR, CCPA, HIPAA, SOX, and other compliance requirements. Adept at bridging technical governance solutions with business and regulatory objectives.

Posted 4 days ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Gurugram

Work from Office

This role involves the development and application of engineering practice and knowledge in designing, managing, and improving processes for industrial operations, including procurement, supply chain, and facilities engineering, as well as maintenance of the facilities. Project and change management of industrial transformations are also included in this role.

Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Posted 4 days ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Acuity's Data and Technology Services group is seeking a skilled Salesforce Data Migration Engineer to lead and execute data migration efforts from legacy systems to Salesforce. The ideal candidate will have deep experience with Salesforce data architecture, ETL tools, and data quality best practices. You will play a critical role in ensuring data integrity, accuracy, and completeness throughout the migration lifecycle.

Desired Skills and Experience
Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field.
3+ years of experience in Salesforce data migration or CRM data migration projects.
Proficiency with Salesforce tools (Data Loader, Workbench, etc.) and ETL platforms.
Strong understanding of the Salesforce data model, objects, relationships, and APIs.
Experience with SQL, SOQL, and data transformation scripting.
Experience working with systems and technologies used by financial institutions.
Experience working with agile methodology, Jira, Confluence, etc.
Good understanding of the technology aspects of APIs, system integration, data transfer, and related stacks.
Excellent communication skills, both written and verbal.
Extremely strong organizational and analytical skills with strong attention to detail.
Strong track record of excellent results delivered to internal and external clients.
Able to work independently without the need for close supervision, and also collaboratively as part of cross-team efforts.
Experience delivering projects within an agile environment.
Experience in project management and team management.

Key responsibilities include:
Analyze legacy data sources and define data mapping and transformation rules for Salesforce.
Design and implement data migration strategies using tools like Data Loader, Talend, Informatica, MuleSoft, or custom scripts.
Collaborate with business analysts, developers, and stakeholders to understand data requirements and ensure alignment with business goals.
Perform data cleansing, deduplication, and validation to ensure high-quality data in Salesforce.
Develop and execute test plans for data migration, including unit testing, system integration testing, and user acceptance testing (UAT).
Monitor and troubleshoot data migration issues, ensuring timely resolution.
Document data migration processes, mappings, and configurations for future reference and audits.
Support post-migration activities, including data reconciliation and performance tuning.
Collaborate with domain experts and business stakeholders to understand business rules/logic.
Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders.
Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
Responsible for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries.
Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT).
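
As an illustration of the migration tooling involved, here is a hedged sketch using the simple-salesforce library to query staged records via SOQL and bulk-upsert Contacts by external ID; the credentials, objects, and field names are hypothetical placeholders.

```python
# Minimal Salesforce migration sketch with simple-salesforce:
# query staged legacy rows via SOQL, cleanse them, and upsert Contacts
# by external ID. Credentials, objects, and fields are hypothetical.
from simple_salesforce import Salesforce

sf = Salesforce(username="migration@example.com",
                password="***",
                security_token="***")

# SOQL: pull previously staged legacy rows for transformation.
result = sf.query("SELECT Id, Legacy_Id__c, Email__c FROM Staging_Contact__c")

records = []
for row in result["records"]:
    email = (row.get("Email__c") or "").strip().lower()  # basic cleansing
    if not email:
        continue  # log and reconcile skipped rows post-migration
    records.append({
        "External_Id__c": row["Legacy_Id__c"],
        "Email": email,
    })

# Upsert by external ID keeps reruns idempotent during migration cycles.
results = sf.bulk.Contact.upsert(records, "External_Id__c", batch_size=5000)
print(sum(1 for r in results if r["success"]), "of", len(results), "succeeded")
```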

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description and Requirements "At BMC trust is not just a word - it's a way of life!" Hybrid Description and Requirements "At BMC trust is not just a word - it's a way of life!" We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation! The IZOT product line includes BMC’s Intelligent Z Optimization & Transformation products, which help the world’s largest companies to monitor and manage their mainframe systems. The modernization of mainframe is the beating heart of our product line, and we achieve this goal by developing products that improve the developer experience, the mainframe integration, the speed of application development, the quality of the code and the applications’ security, while reducing operational costs and risks. We acquired several companies along the way, and we continue to grow, innovate, and perfect our solutions on an ongoing basis. BMC is looking for a Product Owner to join our amazing team! The BMC AMI Cloud Analytics product can quickly transfer, transform, and integrate mainframe data so it could be shared with the organizational data lake to be used by artificial intelligence, machine learning (AI/ML) and analytics solutions. In this role, you will lead the transformation of this cutting-edge product originally developed by Model9, a startup acquired by BMC, into a solution designed to meet the rigorous demands of enterprise customers. This exciting opportunity combines innovation, scalability, and leadership, giving you a chance to shape the product’s evolution as it reaches new heights in enterprise markets. You’ll analyze business opportunities, specify and prioritize customer requirements, and guide product development teams to deliver cutting-edge solutions that resonate with global B2B customers. As a product owner, you will be or become an expert on the product, market, and related business domains. Here is how, through this exciting role, YOU will contribute to BMC's and your own success: Lead the transformation of a startup-level solution from Model9 into a robust enterprise-grade product, addressing the complex needs of global organizations. Collaborate with engineering and QA teams to ensure technical feasibility, resolve roadblocks, and deliver solutions that align with customer needs Help plan product deliveries, including documenting detailed requirements, scheduling releases, and publishing roadmaps. Maintaining a strategic backlog of prioritized features. Drive cross-functional collaboration across development, QA, product management, and support teams to ensure seamless product delivery and customer satisfaction. Distil complex business and technical requirements into clear, concise PRD's and prioritized feature backlogs. 
To ensure you’re set up for success, you will bring the following skillset & experience: 3+ years of software product owner experience in an enterprise/B2B software company, including experience working with global B2B customers Solid technical background (preferably previous experience as a developer or QA) Deep familiarity with public cloud services and storage services (AWS EC2/FSx/EFS/EBS/S3, RDS, Aurora, etc.,) Strong understanding of ETL/ELT solutions and data transformation techniques Knowledge of modern data Lakehouse architectures (e.g., Databricks, Snowflake). B.Sc. in a related field (preferably Software Engineering or similar) or equivalent Experience leading new products and product features through ideation, research, planning, development, go-to-market and feedback cycles Fluent English, spoken and written. Willingness to travel, typically 1-2 times a quarter Whilst these are nice to have, our team can help you develop in the following skills: Background as DBA or system engineer with hands-on experience with commercial and open-source databases like MSSQL, Oracle, PostgreSQL, etc. Knowledge / experience of agile methods (especially lean); familiarity with Aha!, Jira, Confluence. Experience with ETL/ELT tools (e.g., Apache NiFi, Qlik, Precisely, Informatica, Talend, AWS Glue, Azure Data Factory). Understanding of programming languages commonly used on z/OS, such as COBOL, PL/I, REXX, and assembler. Understanding of z/OS subsystems such as JES2/JES3, RACF, DB2, CICS, MQ, and IMS. Experien ce in Cloud-based products and technologies (containerization, serverless approaches, vendor-specific cloud services, cloud security) CA-DNP Our commitment to you! BMC’s culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won’t be known just by your employee number, but for your true authentic self. BMC lets you be YOU! If after reading the above, You’re unsure if you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas! BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page. BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process. At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,790,000 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training, licensure, and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices. ( Returnship@BMC ) Had a break in your career? No worries. 
This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and how to apply.

Posted 4 days ago

Apply

8.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Location: Delhi
Experience: 5–8 Years
Industry: Financial Services / Payments

Job Summary
We are looking for a skilled Data Modeler / Architect with 5–8 years of experience in designing, implementing, and optimizing robust data architectures in the financial payments industry. The ideal candidate will have deep expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms such as Databricks or Snowflake. You will play a key role in designing scalable data models, orchestrating reliable data workflows, and ensuring the integrity and performance of mission-critical financial datasets. This is a highly collaborative role interfacing with engineering, analytics, product, and compliance teams.

Key Responsibilities
Design, implement, and maintain logical and physical data models to support transactional, analytical, and reporting systems.
Develop and manage scalable ETL/ELT pipelines for processing large volumes of financial transaction data.
Tune and optimize SQL queries, stored procedures, and data transformations for maximum performance.
Build and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
Architect data lakes and warehouses using platforms like Databricks, Snowflake, BigQuery, or Redshift.
Enforce and uphold data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
Collaborate closely with data engineers, analysts, and business stakeholders to understand data needs and deliver solutions.
Conduct data profiling, validation, and quality assurance to ensure clean and consistent data.
Maintain clear and comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications
5–8 years of experience as a Data Modeler, Data Architect, or Senior Data Engineer in the financial/payments domain.
Advanced SQL expertise, including query tuning, indexing, and performance optimization.
Proficiency in developing ETL/ELT workflows using tools such as Spark, dbt, Talend, or Informatica.
Experience with data orchestration frameworks: Airflow, Dagster, Luigi, etc.
Strong hands-on experience with cloud-based data platforms like Databricks, Snowflake, or equivalents.
Deep understanding of data warehousing principles: star/snowflake schema, slowly changing dimensions, etc. (see the sketch after this listing).
Familiarity with financial data structures, such as payment transactions, reconciliation, fraud patterns, and audit trails.
Working knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
Strong analytical thinking and problem-solving capabilities in high-scale environments.

Preferred Qualifications
Experience with real-time data pipelines (e.g., Kafka, Spark Streaming).
Exposure to data mesh or data fabric architecture paradigms.
Certifications in Snowflake, Databricks, or relevant cloud platforms.
Knowledge of Python or Scala for data engineering tasks.
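To make the slowly-changing-dimension requirement above concrete, here is a minimal sketch of a Type 2 SCD load expressed as two warehouse SQL statements driven from Python. The table and column names (dim_merchant, stg_merchant, row_hash) and the run_sql helper are hypothetical; a real pipeline would execute these through a warehouse connection inside an Airflow, Dagster, or Luigi task.

```python
# Minimal sketch of a Type 2 slowly changing dimension load.
# Table/column names are illustrative; run_sql stands in for a real
# warehouse connection (e.g., a Snowflake or Databricks cursor).

CLOSE_CHANGED_ROWS = """
UPDATE dim_merchant AS d
SET    valid_to = CURRENT_TIMESTAMP,
       is_current = FALSE
FROM   stg_merchant AS s
WHERE  d.merchant_id = s.merchant_id
  AND  d.is_current = TRUE
  AND  d.row_hash <> s.row_hash;
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_merchant
       (merchant_id, name, tier, row_hash, valid_from, valid_to, is_current)
SELECT s.merchant_id, s.name, s.tier, s.row_hash,
       CURRENT_TIMESTAMP, NULL, TRUE
FROM   stg_merchant AS s
LEFT JOIN dim_merchant AS d
       ON d.merchant_id = s.merchant_id AND d.is_current = TRUE
WHERE  d.merchant_id IS NULL;  -- keys that are brand new or were just closed
"""

def run_sql(statement: str) -> None:
    """Placeholder executor; swap in a real DB-API cursor in production."""
    print(statement.strip())

def load_dim_merchant() -> None:
    # Order matters: close out changed rows first, so the insert's
    # anti-join sees no current row for changed keys and adds the
    # new version; unchanged keys (same row_hash) are left alone.
    run_sql(CLOSE_CHANGED_ROWS)
    run_sql(INSERT_NEW_VERSIONS)

if __name__ == "__main__":
    load_dim_merchant()
```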

Posted 4 days ago

Apply

6.0 years

4 - 6 Lacs

Hyderābād

On-site

Senior Data Modernization Expert

Overview
We are building a high-impact Data Modernization Center of Excellence (COE) to help clients modernize their data platforms by migrating legacy data warehouses and ETL ecosystems to Snowflake. We are looking for an experienced and highly motivated Data Modernization Architect with deep expertise in Snowflake, Talend, and Informatica. This role is ideal for someone who thrives at the intersection of data engineering, architecture, and business strategy, and can translate legacy complexity into modern, scalable, cloud-native solutions.

Key Responsibilities

Modernization & Migration
Lead end-to-end migration of legacy data warehouses (e.g., Teradata, Netezza, Oracle, SQL Server) to Snowflake.
Reverse-engineer complex ETL pipelines built in Talend or Informatica, documenting logic and rebuilding it using modern frameworks (e.g., dbt, Snowflake Tasks, Streams, Snowpark).
Build scalable ELT pipelines using Snowflake-native patterns, improving cost, performance, and maintainability (a minimal Streams-and-Tasks sketch follows this listing).
Design and validate data mapping and transformation logic, and ensure parity between source and target systems.
Implement automation wherever possible (e.g., code converters, metadata extractors, migration playbooks).

Architecture & Cloud Integration
Architect modern data platforms leveraging Snowflake's full capabilities: Snowpipe, Streams, Tasks, Materialized Views, Snowpark, and Cortex AI.
Integrate with cloud platforms (AWS, Azure, GCP) and orchestrate data workflows with Airflow, Cloud Functions, or Snowflake Tasks.
Implement secure, compliant architectures with proper use of RBAC, masking, Unity Catalog, SSO, and external integrations.

Communication & Leadership
Act as a trusted advisor to internal teams and client stakeholders.
Present modernization plans, risks, and ROI to both executive and technical audiences.
Collaborate with delivery teams, pre-sales teams, and cloud architects to accelerate migration initiatives.
Mentor junior engineers and promote standardization, reuse, and COE asset development.

Required Experience
6+ years in data engineering or BI/DW architecture.
3+ years of deep, hands-on Snowflake implementation experience.
2+ years of migration experience from Talend and/or Informatica to Snowflake.
Strong command of SQL, data modeling, ELT pipeline design, and performance tuning.
Practical knowledge of modern orchestration tools (e.g., Airflow, dbt Cloud, Snowflake Tasks).
Familiarity with legacy metadata parsing, parameterized job execution, and parallel processing logic in ETL tools.
Good knowledge of cloud data security, data governance, and compliance standards.
Strong written and verbal communication skills; capable of explaining technical concepts to CXOs and developers alike.

Bonus / Preferred
Snowflake certifications: SnowPro Advanced Architect, SnowPro Core.
Experience building custom migration tools or accelerators.
Hands-on experience with LLM-assisted code conversion tools.
Experience in key verticals like retail, healthcare, or manufacturing.

Why Join This Team?
Opportunity to be part of a founding core team defining modernization standards.
Exposure to cutting-edge Snowflake features and migration accelerators.
High-impact role with visibility across sales, delivery, and leadership.
Career acceleration through complex problem-solving and ownership.
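As an illustration of the Snowflake-native ELT pattern this role calls for, below is a minimal sketch that creates a Stream on a raw table and a scheduled Task that moves captured inserts into a curated layer. It assumes the snowflake-connector-python package, credentials supplied via environment variables, and illustrative object names (raw.orders, curated.orders, etl_wh); a production setup would add error handling, role grants, and monitoring.

```python
# Minimal sketch of a Snowflake-native ELT pattern (Stream + Task).
# Object names and connection details are illustrative assumptions.
import os
import snowflake.connector

STATEMENTS = [
    # Capture change rows landing in the raw table.
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders",
    # Move captured inserts to the curated layer every five minutes,
    # but only when the stream actually has data.
    """
    CREATE OR REPLACE TASK load_orders_curated
      WAREHOUSE = etl_wh
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated.orders (order_id, amount, loaded_at)
      SELECT order_id, amount, CURRENT_TIMESTAMP()
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_orders_curated RESUME",
]

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        for stmt in STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

Reading the stream inside the task advances its offset on commit, so each run processes only the rows that arrived since the previous run.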

Posted 4 days ago

Apply

1.0 - 2.0 years

3 - 5 Lacs

Noida, Gurugram, Bengaluru

Work from Office

What you'll do:
Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc.
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
Collaborate with other team members to leverage expertise and ensure seamless transitions.
Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
Bring transparency in driving assigned tasks to completion and report accurate status.
Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
Assist senior team members and delivery leads in project management responsibilities.

What you'll bring:
Bachelor's degree with specialization in Computer Science, IT, or other computer-related disciplines with a record of academic success.
Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements.
Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc. (a minimal ETL sketch follows this listing).
Experience in data warehousing & SQL.
Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP.
Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams.
Proven ability to work creatively and analytically in a problem-solving environment.
Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Location - Bengaluru, Gurugram, Noida, Pune.
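As a concrete (if toy) example of the ETL and SQL skills listed above, here is a minimal extract-transform-load script in plain Python. The input file sales.csv, its columns, and the SQLite target table are hypothetical stand-ins for the sources and warehouse tables a Talend, Informatica, or SSIS job would handle.

```python
# Minimal ETL sketch: read a CSV, clean it, load it into a warehouse table.
# File/column/table names are illustrative; SQLite stands in for a real
# warehouse so the example runs without external dependencies.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    cleaned = []
    for row in rows:
        # Basic data-quality rules: skip rows missing an id,
        # normalize names, and coerce amounts to a number.
        if not row.get("customer_id"):
            continue
        cleaned.append((
            row["customer_id"].strip(),
            row.get("name", "").strip().title(),
            float(row.get("amount") or 0),
        ))
    return cleaned

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS fact_sales "
            "(customer_id TEXT, name TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```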

Posted 4 days ago

Apply

8.0 years

8 - 8 Lacs

Hyderābād

Remote

Working with Us
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Position Summary
Bristol Myers Squibb is advancing new and next-generation therapies while exploring and investing in technologies to optimize the planning processes to meet future needs. This role will serve as the delivery lead for Supply Planning and help drive optimized solutions in the planning space while ensuring SLA-compliant system support.

Key Responsibilities
Serve as the supply chain planning lead of the Kinaxis Rapid Response system, driving optimal solutions and digital capabilities.
Take responsibility for Rapid Response application support work in a team leader capacity.
Persuade and negotiate effectively with stakeholders to coordinate actions for achieving the desired outcome.
Collaborate with IT and business stakeholders in reviewing business requirements for new capabilities, enhancements, system upgrades, or new deployments.
Look to improve processes, structures, and knowledge within the Rapid Response team.
Lead in analyzing current states, deliver strong recommendations, and execute to bring moderately complex solutions to completion.
Engage with IT, Supply Chain, and Business Insights & Analytics (BI&A) colleagues in exploring cost-effective and sustainable solutions.
Actively participate in various meetings with internal and external stakeholders to drive timely closure of project and support activities.
Ensure timely resolution of outstanding tickets (bug fixes, enhancements) as per the SLA guidelines.
Monitor and provide system support, ensuring the system operates under the service level agreement around availability, performance, accuracy & reliability.
Perform unit and integration tests and assist with user acceptance testing.
Provide system training on new capabilities to the business stakeholders.

Qualifications and Experience
8+ years of experience on advanced planning systems as a developer, analyst, consultant, or end user.
Basic understanding of demand, supply, and S&OP business processes.
Experience with developing, implementing, or supporting supply chain planning solutions (esp. Kinaxis Rapid Response - Demand Planning, Supply Planning, Inventory Planning, S&OP).
Bachelor's Degree in technical engineering, a science field, or a related discipline is required.
Some experience with supply chain planning algorithms such as Heuristic and Optimizer.
Exposure to data integration technologies (such as Talend) with SAP-ERP and other non-SAP systems.
Agile and critical thinker with a passion for innovation and learning new skills.
Excellent verbal, written and interpersonal communication skills; ability to strategically collaborate and influence in the defined area of scope.
Ability to easily navigate through multiple tasks and initiatives.
Ability to balance strategic awareness & direction setting with consistent tactical results.
Good planning, problem solving, analytical, time management and organizational skills.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers
With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol
BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role:
Site-essential roles require 100% of shifts onsite at your assigned facility.
Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture.
For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters.

BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/

Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.

Posted 4 days ago

Apply

1.0 - 2.0 years

2 - 4 Lacs

Noida, Gurugram, Bengaluru

Work from Office

What you'll do:
Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc.
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
Collaborate with other team members to leverage expertise and ensure seamless transitions.
Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
Bring transparency in driving assigned tasks to completion and report accurate status.
Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
Assist senior team members and delivery leads in project management responsibilities.

What you'll bring:
Bachelor's degree with specialization in Computer Science, IT, or other computer-related disciplines with a record of academic success.
Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements.
Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.
Experience in data warehousing & SQL.
Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP.

Additional Skills
Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams.
Proven ability to work creatively and analytically in a problem-solving environment.
Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Posted 4 days ago

Apply