Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must Have Skills: Talend ETL
Good to Have Skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Support Engineer, you will be responsible for identifying and solving issues within multiple components of critical business systems. Your typical day will involve providing support for Talend ETL and ensuring the smooth functioning of the system. You will engage with various stakeholders to understand their needs and provide timely solutions, ensuring that all systems operate efficiently and effectively. Your role will also require you to monitor system performance, troubleshoot issues, and implement necessary updates to maintain optimal functionality. Collaboration with team members and other departments will be essential to ensure that all business processes are supported seamlessly.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of processes and procedures to enhance team knowledge.
- Provide training and support to junior team members to foster their development.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Talend ETL.
- Strong understanding of data integration and transformation processes.
- Experience with troubleshooting and resolving application issues.
- Familiarity with database management and SQL.
- Ability to work collaboratively in a team environment.
- Should have knowledge of the Java programming language, Oracle, SQL Server, and MySQL.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 2 weeks ago
4.0 - 9.0 years
0 - 2 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements. Experience with Microsoft Azure cloud and Snowflake SQL, and database query/performance tuning. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts and ETL tools such as the Talend Cloud Data Integration tool are a must. Exposure to the financial domain is considered a plus. Cloud managed services such as source control (GitHub) and MS Azure/DevOps are considered a plus. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as Visio, PowerPoint, and Excel. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus. Strong SQL knowledge and debugging skills are a must.
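Change data capture, which tools like Qlik Replicate provide efficiently via database transaction logs, can be illustrated with a much cruder snapshot diff: compare two extracts keyed by primary key and emit insert/update/delete events. A minimal sketch; the table rows and column names are hypothetical.

```python
# Minimal sketch of change data capture (CDC) by snapshot comparison.
# Real CDC tools read database logs instead of diffing snapshots, but the
# event model (insert/update/delete) is the same.

def capture_changes(old_rows, new_rows, key="id"):
    """Diff two snapshots of a table and emit CDC-style events."""
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    events = []
    for k, row in new.items():
        if k not in old:
            events.append(("insert", row))      # new key appeared
        elif row != old[k]:
            events.append(("update", row))      # key exists, values changed
    for k, row in old.items():
        if k not in new:
            events.append(("delete", row))      # key disappeared
    return events

yesterday = [{"id": 1, "price": 100.0}, {"id": 2, "price": 250.0}]
today     = [{"id": 1, "price": 105.0}, {"id": 3, "price": 80.0}]

for op, row in capture_changes(yesterday, today):
    print(op, row)
```

Downstream, a replication target would apply these events in order to stay in sync with the source.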
Posted 2 weeks ago
2.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.
Job Summary
Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests, and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge.
Job Description
Key Skills: Advanced SQL (MySQL, Presto, Oracle, etc.), Data Modeling (Normalization and Denormalization), ETL Tools (Talend, Pentaho, Informatica, and creation of custom ETL scripts), Big Data Technologies (Hadoop, Spark, Hive, Kafka, etc.), Data Warehousing (AWS, BigQuery, etc.), Reporting (Tableau, Power BI)
Core Responsibilities
This data-focused role is expected to leverage these skills to design and implement robust data solutions, play a key role in mentoring junior team members, and ensure the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI; knowledge of data quality principles is good to have. Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects.
Programs new software, web applications and supports new applications under development and the customization of current applications. Edits and reviews technical requirements documentation. Works with Quality Assurance team to determine if applications fit specification and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of specified engineering specialty. Displays knowledge of and ability to apply, process design and redesign skills. Displays in-depth knowledge of and ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Employees At All Levels Are Expected To Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer This information has been designed to indicate the general nature and level of work performed by employees in this role. 
It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Relevant Work Experience: 2-5 Years
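One of the key skills this posting lists is data modeling (normalization and denormalization). A small illustrative sketch using SQLite: a normalized customer/order schema, flattened into a denormalized wide result for reporting. Tables and column names are invented for the example.

```python
# Sketch of normalization vs. denormalization with SQLite (stdlib only).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized: each fact lives in exactly one place.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(id),
                     amount REAL);
INSERT INTO customers VALUES (1, 'Asha', 'Hyderabad'), (2, 'Ravi', 'Chennai');
INSERT INTO orders VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0);
""")

# Denormalized view for reporting: duplicate customer attributes onto each
# order row so BI tools can read one wide table without joins.
cur.execute("""
SELECT o.id, c.name, c.city, o.amount
FROM orders o JOIN customers c ON c.id = o.customer_id
ORDER BY o.id
""")
wide = cur.fetchall()
print(wide)  # each order row now repeats the customer's name and city
```

The trade-off is the usual one: the normalized form avoids update anomalies, while the wide form is faster and simpler for reporting reads.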
Posted 2 weeks ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
The Data Quality Lead for Central Master Data Management (MDM) is responsible for ensuring the quality, consistency, and accuracy of master data across the organization. This role involves developing and enforcing data quality standards, performing data profiling, and leading data quality initiatives for the company’s central MDM system. The Data Quality Lead works closely with business units, IT teams, and data stewards to ensure the data used in key business processes is accurate, reliable, and fit for purpose. The role is pivotal in driving data governance best practices and establishing a centralized approach to managing master data, ensuring compliance with regulatory requirements, and improving decision-making across the enterprise. Data Quality Strategy and Framework: Develop and implement a comprehensive data quality strategy for the Central Master Data Management (MDM) function. Define and document data quality policies, standards, and procedures in alignment with business needs and data governance frameworks. Establish metrics and KPIs to monitor and measure data quality and integrity across master data domains (e.g., customers, vendors, products, financial data). Master Data Quality Assurance: Lead data quality initiatives for central master data management, ensuring the accuracy, completeness, and consistency of master data across systems. Conduct regular data quality assessments, data profiling, and audits to identify gaps, anomalies, or data integrity issues. Develop and manage data validation processes and tools to ensure proper data entry, transformation, and integration. Collaboration with Business & IT: Collaborate with business stakeholders, data stewards, and IT teams to understand data quality requirements, challenges, and improvement opportunities. Work with data owners and stewards to resolve data quality issues and ensure the alignment of data definitions, business rules, and standards across the organization. 
Provide guidance and training to business users on data quality best practices and data governance principles.
Data Profiling & Root Cause Analysis: Perform data profiling and analysis to assess the current state of master data and identify areas for improvement.
Data Quality Tools & Technologies: Select, implement, and manage data quality tools and solutions to automate data quality monitoring, reporting, and improvement efforts. Work closely with IT teams to ensure that data quality tools integrate effectively with MDM systems and other data sources.
Master Data Integration: Ensure proper integration of master data across different systems and departments, enforcing data standards during migration, transformation, and loading processes.
Reporting & Documentation: Develop and maintain data quality dashboards, scorecards, and reports for senior management and key stakeholders. Ensure that all data quality rules, processes, and changes are thoroughly documented and maintained in a central repository.
Qualifications: Bachelor’s degree in Information Technology, Data Management, Computer Science, Business, or a related field. Data management and governance certifications (e.g., CDMP, DAMA-DMBOK) are preferred. 8+ years of experience in data quality management, master data management (MDM), or data governance. Proven track record of managing data quality for MDM programs across multiple data domains (e.g., customer, product, supplier, finance). Experience with data quality tools (e.g., Informatica Data Quality, Talend, SAP MDG) and MDM systems (e.g., SAP MDG, Oracle, IBM, Microsoft) preferred.
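The data profiling and audit work described in this posting can be sketched in a few lines: compute completeness and uniqueness per column and count duplicate keys, the classic MDM findings. The sample vendor records and field names below are invented for illustration.

```python
# Hedged sketch of master-data profiling: per-column completeness and
# uniqueness metrics plus a duplicate-key count.

def profile(records, key_field):
    fields = records[0].keys()
    total = len(records)
    report = {}
    for f in fields:
        non_null = sum(1 for r in records if r.get(f) not in (None, ""))
        distinct = len({r.get(f) for r in records if r.get(f) not in (None, "")})
        report[f] = {
            "completeness": non_null / total,  # share of populated values
            "uniqueness": distinct / total,    # share of distinct values
        }
    # Duplicate keys are a classic MDM data-quality finding:
    keys = [r[key_field] for r in records]
    report["duplicate_keys"] = total - len(set(keys))
    return report

vendors = [
    {"vendor_id": "V1", "name": "Acme", "tax_id": "T-1"},
    {"vendor_id": "V2", "name": "Acme", "tax_id": ""},     # missing tax_id
    {"vendor_id": "V2", "name": "Brel", "tax_id": "T-3"},  # duplicate key
]
rep = profile(vendors, "vendor_id")
print(rep["duplicate_keys"], rep["tax_id"]["completeness"])
```

A real profiling tool adds pattern analysis, value distributions, and cross-field rules, but these two metrics are usually where an assessment starts.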
Posted 2 weeks ago
4.0 - 7.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Key Responsibilities: Develop and maintain ETL processes using tools such as Talend or DataStage. Support data ingestion and transformation pipelines from various source systems into data warehouses and lakes. Write, optimize, and troubleshoot SQL queries for data extraction and transformation. Work with business and technical teams to understand data requirements and translate them into scalable solutions.
Required Skills & Qualifications: 4+ years of data engineering experience. Hands-on experience with Talend and SQL, OR hands-on experience with Java, Spark, and SQL. Proficiency in SQL and relational databases (e.g., PostgreSQL, SQL Server, Oracle).
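As a rough illustration of the ETL work described above, here is a minimal extract-transform-load step using SQLite in place of a real warehouse. The staging and target table names are hypothetical; a tool like Talend or DataStage would normally generate or orchestrate this kind of job.

```python
# Minimal ETL sketch: raw strings land in a staging table, then a
# transform standardizes case, casts amounts, aggregates, and loads the
# result into a warehouse table.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw data as landed from the source system (all strings).
cur.execute("CREATE TABLE stg_sales (region TEXT, amount TEXT)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?)",
                [("south", "100.5"), ("SOUTH", "49.5"), ("north", "200")])

cur.execute("CREATE TABLE dw_sales (region TEXT, total REAL)")

# Transform + load: normalize region case, cast amount, aggregate.
cur.execute("""
INSERT INTO dw_sales (region, total)
SELECT UPPER(region), SUM(CAST(amount AS REAL))
FROM stg_sales GROUP BY UPPER(region)
""")
rows = cur.execute("SELECT region, total FROM dw_sales ORDER BY region").fetchall()
print(rows)
```

The staging/target split shown here is the same pattern the posting's "source systems into data warehouses and lakes" line describes, just at toy scale.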
Posted 2 weeks ago
5.0 years
6 - 10 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities: Become part of the Operations, Support & Maintenance team. We need someone technical who can review existing scripts and code to debug, fix and enhance them. Should be able to write intermediate-level SQL and MongoDB queries; this is needed to support customers with their issues and enhancement requests. Support applications/products/platforms during testing and post-production. Develop new code/scripts (not heads-down development!). Analyze and report on data; manage data (data validation, data cleanup). Monitor scheduled jobs and take proactive action; resolve job failures and communicate with stakeholders. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications: Bachelor's Degree. 5+ years of relational database experience (Oracle, SQL Server, DB2, etc.)
3+ years of ETL tool experience (Talend, SQL Server SSIS, Informatica, etc.). 3+ years of programming experience (e.g. Java, JavaScript, Visual Basic, etc.). 2+ years of experience with NoSQL (e.g. MongoDB). 2+ years of experience in the SDLC development process. 1+ years of experience with a job scheduler (e.g. Rundeck, Tivoli). Thorough understanding of production control, such as the change control process. Thorough understanding of REST API services.
Preferred Qualifications: Understanding of the following: vaults such as CyberArk and HashiCorp; document management such as Nuxeo; version control tools (Git); healthcare terminology; Atlassian tools such as JIRA / Bitbucket / Crowd / Confluence; ETL/ELT tools such as Fivetran; Agile methodology.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
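The "intermediate level SQL" this role asks for typically means joins, aggregation, and HAVING filters. A small illustrative example run against SQLite, with an invented members/claims schema:

```python
# Intermediate SQL sketch: join two tables, aggregate per member, and
# keep only members with more than one claim.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE members (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE claims (id INTEGER, member_id INTEGER, amount REAL);
INSERT INTO members VALUES (1, 'A'), (2, 'B'), (3, 'C');
INSERT INTO claims VALUES (1, 1, 50.0), (2, 1, 75.0), (3, 2, 20.0);
""")

# Members with more than one claim, with claim count and total amount.
cur.execute("""
SELECT m.name, COUNT(c.id) AS n_claims, SUM(c.amount) AS total
FROM members m JOIN claims c ON c.member_id = m.id
GROUP BY m.id
HAVING COUNT(c.id) > 1
""")
result = cur.fetchall()
print(result)  # member A has two claims totalling 125.0
```

The same shape of query (filter on an aggregate, not on a row) is what distinguishes HAVING from WHERE in most production troubleshooting.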
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Strong SQL knowledge (query performance tuning, etc.) as well as an understanding of database structure. Knowledge of data modeling principles. Knowledge of at least one ETL tool (SSIS, Talend, Ab Initio, etc.). Strong analytical skills and attention to detail; passionate about data structures and problem solving. Ability to pick up new data tools and concepts quickly. Knowledge of and experience in data warehouse environments, with knowledge of data architecture patterns. Prepares and/or directs the creation of system test plans, test criteria and test data. Determines system design and prepares work estimates for developments or changes across multiple work efforts. Creates or updates system documents (BRD, functional documents), ER diagrams, etc. Must have experience handling and supporting a mid-scale team, with good communication skills to manage the team, understand resource issues, and fix them quickly to improve team productivity.
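Query performance tuning, the first skill listed above, often starts with reading the query plan before and after adding an index. A minimal sketch using SQLite's EXPLAIN QUERY PLAN; the events table is hypothetical, and other engines expose similar EXPLAIN facilities.

```python
# Tuning sketch: the same query goes from a full-table SCAN to an
# index SEARCH once a covering index exists on the filter column.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
cur.executemany("INSERT INTO events VALUES (?, ?, ?)",
                [(i, i % 100, "2024-01-01") for i in range(1000)])

def plan(sql):
    """Return SQLite's query plan for a statement as one string."""
    return " ".join(str(row) for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM events WHERE user_id = 42"
before = plan(query)   # full table SCAN
cur.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)    # index SEARCH via idx_events_user
print(before)
print(after)
```

On large tables the difference between these two plans is the difference between milliseconds and minutes, which is why reading the plan comes before rewriting the query.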
Posted 2 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Engineer
Work Mode: Hybrid (3 days from office, only 4 hours onsite per day)
Location: Gurgaon
About the Role
BayOne is looking for a skilled Data Engineer to join our dynamic team in Gurgaon. This hybrid role offers flexibility, with just 4 hours per day required in-office, 3 days a week. If you're passionate about building scalable data solutions using Azure and Databricks and thrive in a fast-paced environment, we'd love to hear from you.
Key Responsibilities
Design and build scalable data pipelines and data lake/warehouse solutions on Azure and Databricks. Work extensively with SQL, schema design, and dimensional data modeling. Develop and maintain ETL/ELT processes using tools like ADF, Talend, Informatica, etc. Leverage Azure Synapse, Azure SQL, Snowflake, Redshift, or BigQuery to manage and optimize data storage and retrieval. Utilize Spark, PySpark, and Spark SQL for big data processing. Collaborate cross-functionally to gather requirements, design solutions, and implement best practices in data engineering.
Required Qualifications
Minimum 5 years of experience in data engineering, data warehousing, or data lake technologies. Strong experience on the Azure cloud platform (preferred over others). Proven expertise in SQL, data modeling, and data warehouse architecture. Hands-on with Databricks and Spark, and proficient programming in PySpark/Spark SQL. Experience with ETL/ELT tools such as Azure Data Factory (ADF), Talend, or Informatica. Strong communication skills and the ability to thrive in a fast-paced, dynamic environment. Self-motivated, independent learner with a proactive mindset.
Nice-to-Have Skills
Knowledge of Azure Event Hub, IoT Hub, Stream Analytics, Cosmos DB, and Azure Analysis Services. Familiarity with SAP ECC, S/4HANA, or HANA data sources. Intermediate skills in Power BI, Azure DevOps, CI/CD pipelines, and cloud migration strategies.
About BayOne
BayOne is a 12-year-old software consulting company headquartered in Pleasanton, California. We specialize in Talent Solutions, helping clients build diverse and high-performing teams. Our mission is to #MakeTechPurple by driving diversity in tech while delivering cutting-edge solutions across: Project & Program Management, Cloud & IT Infrastructure, Big Data & Analytics, Software & Quality Engineering, and User Experience Design. Join us to shape the future of data-driven decision-making while working in a flexible and collaborative environment.
Posted 2 weeks ago
10.0 - 15.0 years
27 - 30 Lacs
Nagpur, Pune
Work from Office
10+ years in ETL / Data Engineering. Strong in SQL, Python, Unix Shell, PL/I. Tools: DataStage, Informatica, Databricks, Talend. Experience with AWS, Spark, Hadoop. Certifications (CSM, CSPO) and an MTech in Data Science are a plus.
Posted 2 weeks ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Mercer Marsh Benefits Mercer Marsh Benefits is seeking candidates to support its vision of driving value to clients through data and analytics based insights. The following position is based in Mumbai. Mercer Marsh Benefits Analytics – Senior Principal Engineer / Manager - Data Quality Engineer Mercer Marsh Benefits ™ (MMB) is part of the Marsh McLennan family, bringing together a broad spectrum of expertise to help clients navigate the complex world of people risks, cost management and employee benefits. MMB is a global leader in the health and benefits marketplace. Operating in 135 countries, our team of specialists design benefits solutions that meet the needs of businesses and their people, drawing from global intelligence and adapting that wealth of experience to local markets. Mercer Marsh Benefit Analytics is the specialized and technologically advanced data analytics outfit curated to provide data driven insights to our clients in the health and employee benefits space. What can you expect? Joining a rapidly growing organization with an environment that fosters personal and professional development Opportunity to learn new tools and technology. Participate in building a solid data driven foundation for the organization. Work on breakthrough data and analytics products and projects that will create significant value for our clients Opportunity to make a difference to the world by being a part of the health and benefits industry. A chance to work with Industry leaders, global clients and access to latest trends in the industry. What is in it for you? Discover what's great about working at Marsh and McLennan Companies – from the opportunities that our size brings, to our commitment to our communities and understanding the benefits you’ll receive. 
We are four businesses with one PURPOSE: building the confidence to thrive through the power of perspective As a global leader in insurance broking and risk management, we are devoted to finding diverse individuals who are committed to the success of our clients and our organization. Joining us will provide a solid foundation for you to accelerate your career in the risk and insurance industry. We can promise you extraordinary challenges, extraordinary colleagues, and the opportunity to make a difference. Our rich history has created a client service culture that we believe is second to none. Our commitments to Diversity and Inclusion, Corporate Social Responsibility, and sustainability demonstrate our commitment to stand for what is right. As a Marsh and McLennan Company colleague, you will also receive additional benefits such as: A competitive salary Employee friendly policies Health care and insurance for you and your dependents Healthy Work life balance A great working environment Flexible benefits packages to suit your needs and lifestyle Future career opportunities across a global organization We will count on you to: To bring in your experience to help mature data quality offerings and lead data quality initiatives across MMB business. This role will report to the MMB Analytics Data Governance Leader. Roles and Responsibilities will include – Data Quality Management: Support team in implementing & driving end-to-end data quality framework and operating model using relevant DQ tools & capabilities. Help business in driving governance associated with data quality by enforcing policies & standards at both technical & functional level. 
Understand the data landscape, data flows and respective regulatory requirements throughout the organization, working with regions to overcome bottlenecks. Implement strategies to improve data quality for various analytics products and platforms using methods such as data profiling, data cleansing, data enrichment and data validation. Work with the MMB business to understand and build data quality rules that address end-to-end business and data governance requirements.
Data Quality Monitoring & Issue Remediation: Develop solutions for automated DQ checks and threshold alerts. Responsible for establishing data quality metrics and data quality audits. Support issue identification, root cause analysis (RCA) and remediation. Use data profiling techniques to identify DQ issues, leveraging data quality dimensions. Provide recommendations and guidelines for periodic data quality checks and drive governance to ensure long-term data trust and delivery of meaningful analytics. Create scorecards and a reporting process to support data governance councils in monitoring progress and tracking blockers. Track KPIs around DQ dimensions.
Data Quality Implementation: Support projects with end-to-end DQ rules implementation and development activities.
Collaboration: Collaborate with data engineers, data analysts, and data scientists to ensure data quality across all stages of the data lifecycle. Develop relationships with stakeholders across various markets (APAC, UK & Europe, Latin America, Middle East & Canada) to drive and enforce data quality initiatives. Work with data stewards (business and technical) to assess and apply the latest developments in data management and standards. Responsible for the maintenance of MMB data and its metadata, bringing clarity to what the data means, who owns it, where it is stored, and what its quality is. Assist with data classification, data retention and disposal policies. Define and translate data security requirements into data policies and rules to meet data privacy requirements.
Support stakeholders in improving data literacy and advocate the adoption of data governance and data quality management.
What you need to have: 8+ years of hands-on experience in data profiling, DQ rules implementation, DQ issue management, data quality management, DQ metrics reporting and automation activities. Proven experience as a Data Quality Team Lead, Data Quality Engineer or similar role. Master's or Bachelor's degree in information sciences/engineering, or equivalent. Strong experience in an enterprise data quality and cataloging tool (e.g., Informatica Data Quality (IDQ), IDMC Cloud Data Quality/Cloud Data Integration (CDQ/CDI), IICS, Talend, Databricks) preferred. Familiarity with AWS, Azure, or GCP and their associated DQ capabilities. Understanding of cloud-based data storage and data pipeline architecture. Familiarity with languages such as SQL, Python and R. AI/ML: a basic understanding of AI/ML algorithms can help in building predictive models for data quality. Excellent written and verbal communication skills, with demonstrated experience working with international stakeholders.
What makes you stand out: Strong functional and technical skills in data quality management. Experience in the healthcare/insurance industry. Experience building relationships with stakeholders across the globe. Demonstrated experience executing data governance processes and improving data quality for analytics through use of the right tools and capabilities.
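The automated DQ checks and threshold alerts this posting describes can be sketched as a small rule engine: each field gets a validation function and a minimum pass rate, and any breach raises an alert. The rules, fields, and thresholds below are illustrative only; tools like Informatica IDQ implement the same idea at scale.

```python
# Sketch of threshold-based data quality checks: evaluate per-field rules
# and flag any pass rate that falls below its configured threshold.

def run_dq_checks(records, rules):
    """rules: {field: (check_fn, min_pass_rate)} -> list of alert strings."""
    alerts = []
    for field, (check, threshold) in rules.items():
        passed = sum(1 for r in records if check(r.get(field)))
        rate = passed / len(records)
        if rate < threshold:
            alerts.append(
                f"{field}: pass rate {rate:.0%} below threshold {threshold:.0%}")
    return alerts

policies = [
    {"policy_id": "P1", "premium": 1200.0},
    {"policy_id": "P2", "premium": -50.0},   # invalid: negative premium
    {"policy_id": None, "premium": 900.0},   # invalid: missing key
]
rules = {
    # Completeness: every record must have a policy_id.
    "policy_id": (lambda v: v is not None, 1.0),
    # Validity: premiums must be non-negative floats, 90% minimum pass rate.
    "premium": (lambda v: isinstance(v, float) and v >= 0, 0.9),
}
alerts = run_dq_checks(policies, rules)
print(alerts)
```

In a production setup these alerts would feed the scorecards and governance-council reporting mentioned above, rather than being printed.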
Posted 2 weeks ago
5.0 years
0 Lacs
Nagpur, Maharashtra, India
On-site
Experience: 5+ years
Job Role: Sr. Talend Developer
Location: Nagpur
Details: Design, develop, test, and deploy data integration solutions using the Talend ETL tool. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions. Develop complex data pipelines for big data processing and migration projects. Troubleshoot issues related to database connectivity, query optimization, and performance tuning. Ensure timely delivery of projects by managing project schedules, resources, and stakeholders.
Posted 2 weeks ago
2.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Denave is looking for a Business Analyst who has the ability to turn data into knowledge to deliver the best possible solutions to our clients.
Job Responsibilities
Translate business needs into technical specifications, then design, build and deploy BI solutions (e.g. reporting tools). Perform deep-dive data analysis using SQL/Python to extract insights from large datasets. Manage marketing and sales campaign analytics and provide performance visibility. Should have experience in client interactions and sales campaigns. Ability to interpret data insights for the respective stakeholders. Perform hands-on analysis of large volumes of data across multiple datasets, primarily using SQL and/or Python/R. Ability to identify actionable insights by analyzing data with respect to the business focus/needs. Support ad-hoc analysis and requests from business units and leadership.
Job Requirements
B.Tech/BE/BCA/BSc in Computer Science, Engineering or a relevant field, from a reputed engineering college/university, is preferred, or any graduate. Minimum 2 years of experience as a Business Analyst, working with BI and Analytics teams or on client-facing projects, is required. Exposure to large datasets, data cleaning and storytelling through data. Excellent communication skills with stakeholder management experience. ETL (Informatica, Talend, Teradata, Jasper, etc.), BI tools (Cognos, BO, Tableau, Power BI, etc.). Familiarity with BI technologies (e.g. Tableau, Microsoft Power BI, Oracle BI). Candidates from Tier-1 colleges are preferred.
Posted 2 weeks ago
2.0 years
0 Lacs
Hyderabad, Telangana
On-site
Req ID: 334007
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer (Talend & PySpark) to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Job Duties:
Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure. Demonstrate proficiency in coding skills, utilizing languages such as PySpark and Talend to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations. Collaborate seamlessly across diverse technical stacks, including AWS. Develop and deliver detailed presentations to effectively communicate complex technical concepts. Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc. Adhere to Agile practices throughout the solution development process. Design, build, and deploy databases and data stores to support organizational requirements.
Minimum Skills Required:
Basic Qualifications: 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions. Ability to travel at least 25%.
Preferred Skills: Demonstrated production experience in core data platforms such as AWS. Hands-on knowledge of cloud and distributed data storage, including expertise in S3 or other NoSQL storage systems.
Exhibit a strong understanding of data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, and AWS Data Migration Services. Showcase professional written and verbal communication skills to effectively convey complex technical concepts. Undergraduate or graduate degree preferred.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here.
For Pay Transparency information, please click here.
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Responsibilities
The Sr. Integration Developer (Senior Software Engineer) will work in the Professional Services Team and play a significant role in designing and implementing complex integration solutions using the Adeptia platform. This role requires hands-on expertise in developing scalable and efficient solutions to meet customer requirements. The engineer will act as a key contributor to the team's deliverables while mentoring junior engineers. They will ensure high-quality deliverables by collaborating with cross-functional teams and adhering to industry standards and best practices.
Responsibilities include, but are not limited to:
- Collaborate with customers' Business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI Integration space and implement solutions using the Adeptia platform.
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability.
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment.
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance.
- Guide team members in addressing complex integration challenges and promote best practices and performance practices.
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables.
- Write efficient, well-documented, and maintainable code, adhering to established coding standards.
- Review code and designs of team members, providing constructive feedback to improve quality.
- Participate in Agile processes, including Sprint Planning, Daily Standups, and Retrospectives, ensuring effective task management and delivery.
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills.
Essential Skills: Technical
- 5-7 years of hands-on experience designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools.
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms.
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations.
- Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity.
- Proficiency in working with SOA, RESTful APIs, and SOAP Web Services with all security policies.
- Good understanding and implementation experience with various security concepts, best practices, and security standards and protocols such as OAuth, SSL/TLS, SSO, SAML, and IdP (Identity Provider).
- Strong understanding of XML, XSD, XSLT, and JSON.
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL).
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ.
- Hands-on experience in Core Java and exposure to commonly used Java frameworks.
Desired Skills: Technical
- Familiarity with JavaScript frameworks like ReactJS, AngularJS, or NodeJS.
- Exposure to integration standards (EDI, EDIFACT, IDOC).
- Experience with modern web UI tools and frameworks.
- Exposure to DevOps tools such as Git, Jenkins, and CI/CD pipelines.
Non-Technical
- Onshore experience working directly with customers.
- Strong time management skills and the ability to handle multiple priorities.
- Detail-oriented and enthusiastic about learning new tools and technologies.
- Committed to delivering high-quality results.
Flexible, responsible, and focused on quality work. Ability to prioritize tasks, work under pressure, and collaborate with cross-functional teams. About Adeptia Adeptia believes business users should be able to access information anywhere, anytime by creating data connections themselves, and its mission is to enable that self-service capability. Adeptia is a unique social network for digital business connectivity for “citizen integrators” to respond quickly to business opportunities and get to revenue faster. Adeptia helps Information Technology (IT) staff to manage this capability while retaining control and security. Adeptia's unified hybrid offering, with simple data connectivity in the cloud and optional on-premises enterprise process-based integration, provides a competitive advantage to 450+ customers, ranging from Fortune 500 companies to small businesses. Headquartered in Chicago, Illinois, USA and with an office in Noida, India, Adeptia provides world-class support to its customers around-the-clock. For more, visit www.adeptia.com Our Locations: India R&D Centre: Office 56, Sixth floor, Tower-B, The Corenthum, Sector-62, Noida, U.P. US Headquarters: 332 S Michigan Ave, Unit LL-A105, Chicago, IL 60604, USA
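The JSON/XML mapping skills this role calls for come down to a small, concrete pattern. Below is a minimal sketch using only the Python standard library; the `order` payload and its field names are invented for illustration and are not Adeptia APIs:

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(payload: dict, root_tag: str = "order") -> str:
    """Map each top-level JSON field onto a child XML element."""
    root = ET.Element(root_tag)
    for key, value in payload.items():
        # Every scalar field becomes <key>value</key> under the root.
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical inbound message from a source system.
raw = '{"id": "A-100", "amount": 250.0, "currency": "USD"}'
xml_doc = json_to_xml(json.loads(raw))
print(xml_doc)
# <order><id>A-100</id><amount>250.0</amount><currency>USD</currency></order>
```

Production integration platforms add schema (XSD) validation, nested structures, and error handling on top of this; the sketch only shows the shape of a field-level mapping.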
Posted 2 weeks ago
7.0 years
0 Lacs
India
Remote
Mars Data is hiring a full-time Data Engineer (SQL, Talend ETL, GCP & Azure) in remote locations.
Location: Remote / WFH
Relevant Experience: 7+ years
Job Type: Full-Time
Notice Period: Immediate to 15 days
Shift: Mid Shift (IST)
We are seeking a highly skilled Senior Data Engineer with 7+ years of experience, specializing in SQL, Talend, Google Cloud Platform (GCP) BigQuery, and Microsoft Azure. The ideal candidate will be responsible for designing, building, and optimizing SQL-driven data pipelines, ensuring high performance, scalability, and data integrity.
Required Qualifications
- 7+ years of hands-on experience in SQL development, database performance tuning, and ETL processes.
- Expert-level proficiency in SQL, including query optimization, stored procedures, indexing, and partitioning.
- Strong experience with Talend for ETL/ELT development.
- Hands-on experience with GCP BigQuery and Azure SQL / Synapse Analytics.
- Solid understanding of data modeling (relational & dimensional) and cloud-based data architectures.
- Proficiency in Python or Shell scripting for automation and workflow management.
- Familiarity with CI/CD, Git, and DevOps best practices for data engineering.
Nice to Have
- Experience with Apache Airflow or Azure Data Factory for workflow automation.
- Knowledge of real-time data streaming (Kafka, Pub/Sub, Event Hubs).
- Cloud certifications in GCP or Azure (e.g., Google Professional Data Engineer, Azure Data Engineer Associate).
Why Join Us?
- Lead and innovate in SQL-driven, cloud-first data solutions.
- Work on cutting-edge data engineering projects in a collaborative, agile team.
- Opportunities for career growth, leadership, and certifications.
Key Responsibilities
- Develop and optimize complex SQL queries, stored procedures, and indexing strategies for large datasets.
- Design and maintain ETL/ELT data pipelines using Talend, integrating data from multiple sources.
- Architect and optimize data storage solutions on GCP BigQuery and Azure SQL / Synapse Analytics.
- Implement best practices for data governance, security, and compliance in cloud environments.
- Work closely with data analysts, scientists, and business teams to deliver scalable solutions.
- Monitor, troubleshoot, and improve data pipeline performance and reliability.
- Automate data workflows and scheduling using orchestration tools (e.g., Apache Airflow, Azure Data Factory).
- Lead code reviews, mentoring, and best practices for junior engineers.
Send your resume to hr@marsdata.in
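The pipeline monitoring and validation work described above often starts with a simple source-to-target reconciliation. The sketch below uses an in-memory SQLite database as a stand-in for the real systems (e.g. an OLTP source and BigQuery); the table names and sample rows are hypothetical:

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live
# in separate systems and be queried through their own clients.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(conn, src: str, tgt: str) -> dict:
    """Compare row counts and flag ids present in source but missing in target."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    missing = [row[0] for row in conn.execute(
        f"SELECT id FROM {src} EXCEPT SELECT id FROM {tgt}")]
    return {"src": src_count, "tgt": tgt_count, "missing_ids": missing}

report = reconcile(conn, "src_orders", "tgt_orders")
print(report)  # {'src': 3, 'tgt': 2, 'missing_ids': [3]}
```

A real pipeline would add checksum or per-column comparisons and alert on any mismatch rather than just printing a report.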
Posted 2 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Warehouse ETL Testing
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement ETL test cases to ensure data accuracy.
- Conduct data validation and reconciliation processes.
- Collaborate with cross-functional teams to troubleshoot and resolve data issues.
- Create and maintain test documentation for ETL processes.
- Identify opportunities for process improvement and optimization.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and methodologies.
- Hands-on experience in data quality assurance and testing.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
Posted 2 weeks ago
4.0 - 7.0 years
14 - 22 Lacs
Hyderabad
Work from Office
Job Title: Senior / Lead ETL Test Engineer
Work Location: Hyderabad (5 days work from office)
Years of Experience: 3 to 12 years
Key Responsibilities:
- Design, develop, and execute test cases for ETL workflows and data pipelines.
- Validate data transformations and ensure data integrity across source and target systems.
- Perform unit, system, and integration testing for data migration projects.
- Collaborate with data engineers and developers to troubleshoot and resolve data issues.
- Automate test processes to improve efficiency and coverage.
- Document test results and defects, and provide detailed reports to stakeholders.
- Ensure compliance with data governance and quality standards.
Required Skills:
- 3-6 years of experience in ETL testing, data validation, or related roles.
- Strong proficiency in SQL for data querying and validation.
- Hands-on experience with ETL tools such as Talend (preferred), Informatica, or DataStage.
- Experience with test automation tools and scripting.
- Understanding of data migration processes and challenges.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Knowledge of data warehousing concepts and platforms (e.g., Snowflake).
Preferred Skills:
- Experience with SAS programming and SAS Viya.
- Exposure to tools like SonarQube, Qlik Replicate, or IBM Data Replicator.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data governance and compliance frameworks.
Posted 2 weeks ago
0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Company Overview
Logile is the leading retail labor planning, workforce management, inventory management and store execution provider, deployed in thousands of retail locations across North America, Europe, Australia, and Oceania. Our proven AI, machine-learning technology and industrial engineering accelerate ROI and enable operational excellence with improved performance and empowered employees. Retailers worldwide rely on Logile solutions to boost profitability and competitive advantage by delivering the best service and products at optimal cost. From labor standards development and modeling to unified forecasting, storewide scheduling, and time and attendance, to inventory management, task management, food safety, and employee self-service, we transform retail operations with a unified store-level solution. One platform for store planning, scheduling, and execution. For more information, visit www.logile.com.
Data Integration ETL Specialist: Role and Responsibilities
Enterprise Data Integration is a key component of the products Logile offers to customers. Our application needs to consume data from various customer systems to provide accurate forecasts, demand, reporting, etc. The Enterprise Data Integration team plays a vital role in integrating this data into our application and serves as the subject matter expert for that data, providing guidance and troubleshooting support to both our customers and other Logile teams.
Key Responsibilities
As a Data Integration ETL Specialist, you will be responsible for supporting customer data in the Logile application, including but not limited to:
- Develop and maintain scalable ETL pipelines to support various types of data loads.
- Dev/unit and QA testing of ETL processes to ensure data accuracy and optimized performance.
- Perform resolution efforts for any issues related to ETL processing.
- Ensure quality and timely completion of all deliverables.
- Track and provide updates to the manager on milestones and activities.
The Data Integration ETL Specialist will contribute to Logile's company growth in a variety of areas, including but not limited to:
- Internal training development
- Offering strategy and growth
- Peer coaching and development
Job Location & Schedule: This is an onsite job at the Logile Bhubaneswar office. The selected candidate is expected to be available to work flexible hours to support US projects and accounts.
Skills & Experience:
- Proficiency in English; good verbal and written communication abilities.
- Experience working with cross-functional teams on software implementations; SaaS experience.
- Bachelor's degree in Computer Programming, or commensurate work experience.
- Strong expertise in ETL processes, including data modeling, data mapping, and large-volume data processing.
- Proficient in SQL and relational databases.
- Familiarity with Application Programming Interface (API) data pipeline integration using the Java or Python languages.
- Extensive knowledge of different data languages, formats and syntax, including SQL, JavaScript Object Notation (JSON), Extensible Markup Language (XML), etc.
- Experience with AWS ETL tools like Glue and Data Pipeline, and with Talend, is a plus.
- Understanding of data cleansing, data verification, and data configuration.
- Familiarity with Jenkins and/or other CI tools for monitoring and establishing data pipelines and task automation is preferred.
- Familiarity with a retail data ecosystem is a plus.
- Experienced in Agile development.
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Title: L2 Application Support Engineer – Enterprise Data Warehouse (AWS)
Job Location: Remote
Experience: 3-5 years. Need immediate joiners.
Work Schedule: 24x7 rotational shifts, weekend support, with on-call availability for high-priority batch or data issues.
Key Responsibilities
ETL & Data Pipeline Monitoring
- Monitor daily ETL jobs and workflows using tools like Apache Airflow, AWS Glue, or Talend.
- Handle data ingestion failures, perform initial triage, and escalate to L3 or data engineering if needed.
- Maintain job runbooks, batch schedules, and alert configurations.
Incident & Problem Management
- Provide L2-level troubleshooting and resolution for data warehouse issues and outages.
- Log incidents and service requests in ITSM tools (e.g., ServiceNow, Jira).
- Perform root cause analysis (RCA) and create post-incident reports.
Data Quality & Validation
- Run and validate data quality checks (e.g., nulls, mismatches, record counts).
- Ensure integrity of data ingestion from source systems (e.g., Finacle, UPI, CRM).
- Collaborate with business analysts and QA teams to confirm expected outputs.
Cloud Operations & Automation
- Monitor AWS services (Redshift, S3, Glue, Lambda, CloudWatch, Athena) and respond to alerts.
- Automate recurring support tasks using Python, Shell scripting, or Lambda triggers.
- Work closely with cloud DevOps or engineering teams for patching, scaling, or performance tuning.
Reporting & Documentation
- Maintain support documentation, knowledge base articles, and daily handover reports.
- Assist in preparing monthly uptime/SLA reports.
- Participate in audit reviews and ensure compliance logs are retained.
Technical Skills
- AWS Services: Redshift, S3, Glue, Athena, Lambda, CloudWatch
- ETL Tools: AWS Glue, Apache Airflow, Talend
- Scripting: Python, Shell, SQL (advanced)
- Monitoring: AWS CloudWatch, Grafana, Prometheus (optional)
- ITSM: ServiceNow, Jira
- Database: PostgreSQL, Redshift SQL, MySQL
- Security: IAM policies, data masking, audit logs
Soft Skills & Functional Knowledge
- Good understanding of data warehousing and BI principles.
- Excellent communication skills to liaise with business, operations, and L3 teams.
- Analytical thinking and a proactive problem-solving approach.
- Ability to handle high-pressure production issues in real time.
Preferred Certifications
- AWS Certified Data Analytics – Specialty (preferred)
- AWS Certified Solutions Architect – Associate
- ITIL Foundation (for incident/change processes)
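The data quality checks this posting mentions (nulls, mismatches, record counts) can be sketched in plain Python. The record layout below is hypothetical, a stand-in for whatever the ingestion feed actually delivers:

```python
# Hypothetical batch of ingested records; the field names are illustrative only.
batch = [
    {"txn_id": "T1", "amount": 120.5, "branch": "HYD"},
    {"txn_id": "T2", "amount": None,  "branch": "BLR"},
    {"txn_id": "T3", "amount": 99.0,  "branch": None},
]

def quality_report(records, required_fields):
    """Count records and per-field nulls: the first triage step before escalating."""
    nulls = {field: 0 for field in required_fields}
    for rec in records:
        for field in required_fields:
            if rec.get(field) is None:
                nulls[field] += 1
    return {"record_count": len(records), "null_counts": nulls}

report = quality_report(batch, ["txn_id", "amount", "branch"])
print(report)
# {'record_count': 3, 'null_counts': {'txn_id': 0, 'amount': 1, 'branch': 1}}
```

In an L2 role, a report like this would feed an alert threshold: escalate to L3 when null counts on a required field exceed the agreed tolerance.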
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Title: L2 Application Support Engineer – AML Platform (Jocata)
Job Location: Remote
Experience: 3-5 years. Immediate joiners preferred (within 0-15 days).
Work Schedule: 24x7 rotational shifts, weekend support, and on-call duties for critical escalations or batch failures.
Key Responsibilities
Technical Support & Maintenance
- Monitor and manage day-to-day operations of the AML system (Jocata GRID), including process schedulers, ETL jobs, alerts, rule engines, and dashboards.
- Provide Level 2 troubleshooting and perform root cause analysis (RCA) for issues escalated by L1 or end users.
- Work on incident management using ITSM tools (e.g., ServiceNow, Remedy), ensuring resolution within SLA and proper documentation.
- Coordinate with L3 vendor support (Jocata) for unresolved issues, patch deployments, and hotfixes.
Configuration & Rules Management
- Perform changes to alert thresholds, typologies, and rules as per compliance team requirements.
- Assist in testing and deployment of AML rule configurations, scenario tuning, and performance impact analysis.
- Maintain version control of rule sets and workflow configurations using proper DevOps or change control protocols.
Data & Security Compliance
- Ensure that the system adheres to internal data protection and external regulatory standards (FATF, RBI, FIU-IND).
- Monitor data feeds (KYC, transactions, customer profiles) from core banking, payments, and credit systems.
- Validate ETL pipelines and data reconciliation processes between Jocata and source systems (e.g., Finacle, UPI, NACH).
Monitoring, Reporting & Audit Support
- Generate or validate AML reports, STRs (Suspicious Transaction Reports), and CTRs (Cash Transaction Reports) for submission to regulators.
- Work with compliance teams during internal/external audits to retrieve logs, evidence, and reports.
- Maintain and improve health-check scripts, monitoring dashboards, and alerting systems (e.g., Grafana, ELK, SQL Monitor).
Technical Skills
- Operating Systems: Linux/Unix shell scripting, Windows Server admin basics
- Database: Oracle/SQL Server/PostgreSQL (complex queries, joins, views)
- ETL Tools: Pentaho / Talend / custom ETL scripts
- Monitoring: ELK Stack, Prometheus, Nagios, or in-house tools
- Scripting: Shell, Python, or PowerShell (for automation/log parsing)
- ITSM Tools: ServiceNow, Jira Service Desk
- Regulatory Understanding: AML typologies, STR/CTR norms, RBI/FIU-IND standards
Soft Skills & Functional Knowledge
- Understanding of banking operations and the AML workflow process.
- Ability to document standard operating procedures and knowledge base articles.
Certifications
- Certified Anti-Money Laundering Specialist (CAMS) – optional
- ITIL Foundation (for incident and change management processes)
- Jocata GRID hands-on experience or certification (if applicable)
Posted 2 weeks ago
4.0 - 6.0 years
18 - 22 Lacs
Hyderabad, Chennai
Work from Office
Preferred candidate profile
This role is primarily responsible for development in Talend and any database.
Primary Skillset: AWS, ETL (Talend), SQL
Secondary Skillset: ETL (DataStage), Apache Airflow, Control-M, Big Data, Python, Hadoop and Spark
Location: Hyderabad and Chennai
Notice Period: Immediate joiners only
Working Mode: Hybrid
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Talend ETL.
- Good To Have Skills: Experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Experience in application testing and debugging techniques.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Chennai office.
- A 15 years full-time education is required.
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Responsibilities
- Develop and support scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models
Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation and manipulation
- Experience modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines.
- Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience with 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts; relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Title: Analyst - Coupa (GCL: C3)
Introduction To Role
Are you ready to disrupt an industry and change lives? As an Analyst specializing in the Coupa platform, you'll leverage your technical expertise to support the design, implementation, and integration of this progressive technology. You'll be the technical domain expert, leading change and scaling solutions by collaborating with stakeholders. Your work will directly impact our ability to develop life-changing medicines and empower the business to perform at its peak.
Accountabilities
Technical Ownership
- Support Coupa technical solution design and implementation in alignment with design decisions.
- Participate in design discussions and contribute to decisions.
- Engage in the full lifecycle of Coupa technical delivery, from concept to design to deployment and post-implementation stabilization.
- Gather high-level business requirements, perform analysis, define Coupa technology requirements, and design solutions based on completed analysis.
Integration & Middleware Oversight
- Support mapping between legacy and target systems (Coupa), coordinating with middleware and interface teams.
- Define and validate integration points across systems, applications, and services.
- Support development and testing of APIs, messaging frameworks, error handling, push-pull mechanisms, and data pipelines.
Data Migration Execution
- Support large-volume data migration activities, including mock runs, cutover rehearsal, and production release support.
- Ensure data cleansing, mapping rules, and exception handling are well documented and implemented.
- Collaborate with business stakeholders to define data acceptance criteria and validation plans.
DevOps Skills
- Demonstrate strong knowledge of the Coupa platform and its integrations.
- Actively assess system enhancements and deploy them in accordance with the latest platform product release.
- Identify process improvements and implement change with clear outcomes of improvement and standardization.
- Undertake diagnostic work to understand specific technical issues or problems in greater depth.
- Manage change management end-to-end and support testing activities by triaging, scenario setting, etc.
- Deliver platform-based projects to improve adoption of the latest features.
- Resolve issues by partnering with technical, finance, and procurement teams, and vendors.
Essential Skills/Experience
- Coupa certified
- 6+ years of overall IT experience with a good background in Coupa technical delivery roles
- Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar
- Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting)
- Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies
- Strong problem-solving and analytical skills in technical and data contexts
- Ability to translate complex technical designs into business-aligned delivery outcomes
- Leadership in cross-functional and cross-technology environments
- Effective communicator capable of working with developers, data engineers, testers, and business stakeholders
- Experienced with IT Service Management tools like ServiceNow and Jira
- Experience in managing and developing third-party business relationships
Educational Qualifications
- UG - B.Tech/B.E. or other equivalent technical qualifications
When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.
At AstraZeneca, you'll be part of a dynamic environment where innovation thrives. Our commitment to innovative science combined with leading digital technology platforms empowers us to make a significant impact. With a spirit of experimentation and collaboration across diverse teams, we drive cross-company change to disrupt the industry. Here, you can explore new technologies, shape your own path, and contribute to developing life-changing medicines. Ready to make a difference? Apply now to join our journey! Date Posted 16-Jul-2025 Closing Date 29-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
- Become part of the Operations, Support & Maintenance team
- We need someone technical who can review existing scripts and code to debug, fix, and enhance them, and who can write intermediate-level SQL and MongoDB queries. This is needed to support customers with their issues and enhancement requests
- Support applications/products/platforms during testing and post-production
- Develop new code/scripts. Not heads-down development!
- Analyze and report on data; manage data (data validation, data cleanup)
- Monitor scheduled jobs and take proactive actions. Resolve job failures, and communicate to stakeholders
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications
- Bachelor's Degree
- 5+ years of relational database experience (Oracle, SQL Server, DB2, etc.)
3+ years of ETL tool experience (Talend, SQL Server SSIS, Informatica, etc.) 3+ years of programming experience (e.g. Java, Javascript, Visual Basic, etc.) 2+ years of experience with NOSQL (e.g. MongoDB) 2+ years of experience in SDLC development process 1+ years of experience in job scheduler (e.g. Rundeck, Tivoli) Thorough understanding of Production control such as change control process Thorough understanding of REST API services Preferred Qualifications Understanding of the following: Vaults such as CyberArk, Hashi Corp Document Management such as Nuxeo Version control tool (Git) technology Healthcare terminology Atlassian Tools such as JIRA / Bit Bucket / Crowd / Confluence ETL / ELT tool such as FiveTran Understanding of Agile methodology At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 2 weeks ago