3.5 years
0 Lacs
Pune, Maharashtra, India
On-site
Hiring across all levels. Location: Pune (preferred) / Gurgaon

Mandatory Skillset:
- Validated systems testing and end-to-end test management expertise
- Strong client engagement

Responsibilities:
- Contribute to the project's overall Computer System Validation (CSV) deliverables.
- Create and execute test scenarios; select the best testing methodologies, techniques, and evaluation criteria.
- Draft, review, and approve validation deliverables such as user requirements, technical design specifications, IQ/OQ/PQ scripts and reports, error logs, configuration documents, and traceability matrices, and document 21 CFR Part 11 and EU Annex 11 compliance.
- Build automation scripts and help the leads design/configure test automation frameworks.
- Understand the various testing activities: unit, system, component, user acceptance, performance, integration, and regression.
- Participate in user story mapping, sprint planning, estimation, and feature walk-throughs.
- Experience in leading junior test analysts, assigning and tracking tasks given to team members.
- Manage the end-to-end testing/validation lifecycle using applications such as Solution Manager, JIRA, and HP ALM (desired).
- Self-motivated, team-oriented individual with strong problem-solving abilities.
- Evaluate risks, assess closure requirements, and process change controls for computerized systems.
- Define key test processes, best practices, KPIs, and collateral.
- Well versed in ETL or automation testing.

Qualifications:
- Bachelor's/Master's degree in Engineering, Science, Medicine, or a related field.
- Hands-on experience in Computer System Validation of applications and infrastructure qualification.
- A minimum of 3.5-11 years of experience in computer systems validation and hands-on experience within a GxP (GCP/GMP) regulated environment (FDA, EU, MHRA).
- Experience in managing and leading testing-related activities.
- Experience in creating test scenarios and testing protocols (IQ/OQ/PQ/UAT) within the various SDLC or Agile phases as per the standard GxP protocol.
- In-depth understanding of defect management processes.
- Strong SQL skills for data validation and development of expected results.
- Hands-on with test management tools: JIRA and HP ALM.
- Experience in defining risk-based strategies for validation of computerized systems, and authoring and reviewing end-to-end CSV documentation in accordance with various test plans.
- Understanding of CSV and project-related SOPs and WIs.
- Well versed in Good Documentation Practices.
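The role above calls for strong SQL skills for data validation (for example, verifying a migrated dataset during IQ/OQ). A minimal sketch of that idea, using SQLite and hypothetical table/column names (`src`, `tgt`, `id`, `dose` are illustrative, not from the listing):

```python
import sqlite3

def validate_migration(conn, source_table, target_table, key_col):
    """Return keys present in the source table but missing from the target."""
    cur = conn.execute(f"""
        SELECT {key_col} FROM {source_table}
        EXCEPT
        SELECT {key_col} FROM {target_table}
    """)
    return [row[0] for row in cur.fetchall()]

# Demo with hypothetical migration tables
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, dose TEXT);
    CREATE TABLE tgt (id INTEGER, dose TEXT);
    INSERT INTO src VALUES (1, '10mg'), (2, '20mg'), (3, '30mg');
    INSERT INTO tgt VALUES (1, '10mg'), (2, '20mg');
""")
print(validate_migration(conn, "src", "tgt", "id"))  # → [3]
```

In a real validation protocol the mismatch list would feed the expected-results section of an OQ script rather than a `print`.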
Posted 2 days ago
0.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
Experience: 6+ years
Work Mode: Hybrid

Job Summary: We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Troubleshoot and optimize ETL jobs for performance and reliability.
- Analyze complex data sets and write advanced SQL queries for data validation and transformation.
- Collaborate with data architects and business analysts to implement data warehousing solutions.
- Apply SDLC methodologies throughout the ETL development lifecycle.
- Support production environments by identifying and resolving data and performance issues.
- Work with Unix shell scripting for job automation and scheduling.
- Contribute to the design of technical architectures that support digital transformation.

Required Skills:
- 3-5 years of hands-on experience with Informatica PowerCenter.
- Proficiency in SQL and familiarity with NoSQL platforms.
- Experience in ETL performance tuning and troubleshooting.
- Solid understanding of Unix/Linux environments and scripting.
- Excellent verbal and written communication skills.

Preferred Qualifications:
- AWS Certification or experience with cloud-based data integration is a plus.
- Exposure to data modeling and data governance practices.
Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)

Application Question(s):
- What is your current CTC?
- What is your expected CTC?
- What is your current location?
- What is your notice period / LWD?
- Are you comfortable attending an L2 face-to-face interview in Hyderabad?

Experience:
- Informatica PowerCenter: 5 years (Required)
- Total work: 6 years (Required)

Work Location: In person
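The ETL developer listing above highlights writing advanced SQL for data validation. One common check is comparing aggregates per group between a staging table and the warehouse; here is a minimal sketch using SQLite, with hypothetical table names (`stg_orders`, `dw_orders`) standing in for real Informatica targets:

```python
import sqlite3

# Hypothetical staging/warehouse tables for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (region TEXT, amount REAL);
    CREATE TABLE dw_orders  (region TEXT, amount REAL);
    INSERT INTO stg_orders VALUES ('north', 100), ('north', 50), ('south', 75);
    INSERT INTO dw_orders  VALUES ('north', 150), ('south', 75);
""")

def aggregate_diff(conn):
    """Regions where SUM(amount) disagrees between staging and warehouse."""
    return conn.execute("""
        SELECT s.region, s.total, COALESCE(d.total, 0) AS dw_total
        FROM (SELECT region, SUM(amount) AS total FROM stg_orders GROUP BY region) s
        LEFT JOIN (SELECT region, SUM(amount) AS total FROM dw_orders GROUP BY region) d
          ON s.region = d.region
        WHERE s.total != COALESCE(d.total, 0)
    """).fetchall()

print(aggregate_diff(conn))  # → [] when staging and warehouse agree
```

An empty result means every region's total survived the load; any returned rows pinpoint where a mapping dropped or duplicated data.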
Posted 2 days ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Technical Architect - Data Governance & MDM
Experience: 15+ years
Location: Mumbai/Pune/Bangalore

Role Overview: The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Develop and optimize strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills:
- 10+ years of experience working on DG/MDM projects
- Strong on Data Governance concepts
- Hands-on with different DG tools/services
- Hands-on with reference data and taxonomy
- Strong understanding of Data Governance, Data Quality, Data Profiling, Data Standards, Regulations, and Security
- Match-and-merge strategy
- Design and implement the MDM architecture and data models
- Usage of Spark capabilities
- Statistics to deduce meaning from vast enterprise-level data
- Different data visualization means of analyzing huge data sets
- Good to have: knowledge of Python/R/Scala
- Experience with DG on-premise and on-cloud
- Understanding of the MDM Customer, Product, and Vendor domains and related artifacts
- Experience working on proposals, customer workshops, assessments, etc. is preferred
- Must have good communication and presentation skills
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere

Eligibility Criteria:
- 15+ years of total experience.
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.

Interested candidates can apply directly. Alternatively, you can also send your resume to ansari.m@atos.net
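The skills above include a match-and-merge strategy, the MDM step that collapses duplicate records from different source systems into one golden record. A toy sketch of the idea using only the standard library; the threshold, record shapes, and source IDs (`crm-1`, `erp-7`) are hypothetical, and real platforms like Reltio or IBM MDM use far richer matching rules:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    # Simple string similarity as a stand-in for real match rules.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_and_merge(records, threshold=0.85):
    """Greedy match: fold records whose names score above the threshold into
    the first-seen (survivor) record, collecting their source IDs."""
    golden = []  # list of (survivor_name, [source_ids])
    for rec_id, name in records:
        for _, (gname, ids) in enumerate(golden):
            if similarity(name, gname) >= threshold:
                ids.append(rec_id)
                break
        else:
            golden.append((name, [rec_id]))
    return golden

records = [
    ("crm-1", "Acme Corporation"),
    ("erp-7", "ACME Corporation"),   # case variant of the same customer
    ("crm-2", "Globex Ltd"),
]
print(match_and_merge(records))
```

The two Acme variants merge into one golden record with both source IDs retained for lineage; Globex survives untouched.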
Posted 2 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: PL/SQL, SQL Writing, MySQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Primary Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements:
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have):
- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

What We Offer:
- Fully remote internship with flexible working hours.
- Hands-on experience with real-world database projects.
- Mentorship from experienced database professionals.
- Certificate of completion and potential for a full-time opportunity based on performance.
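The requirements above mention basic knowledge of indexing and query optimization. A quick way to see an index at work is SQLite's `EXPLAIN QUERY PLAN`; this sketch uses a hypothetical `orders` table (the exact plan wording varies between SQLite versions, so the comments describe the shape rather than exact strings):

```python
import sqlite3

# Hypothetical orders table to demonstrate how an index changes the query plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(conn, sql):
    # Row format: (id, parent, notused, detail); detail describes the access path.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(conn, query))  # full table scan before the index exists
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(plan(conn, query))  # plan now mentions idx_orders_customer
```

The same before/after comparison works (with different syntax) in MySQL's `EXPLAIN` or PostgreSQL's `EXPLAIN ANALYZE`, which is the habit the role is really asking for.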
Company Description: Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 days ago
5.0 - 10.0 years
19 - 30 Lacs
Bengaluru
Hybrid
Greetings from Altimetrik! We are looking for a highly skilled and experienced Data Engineer to join our dynamic team.

Technical Skills & Qualifications:
- Strong understanding of data engineering and dimensional design fundamentals; good at SQL, integration (ETL), and front-end analysis/data visualization; learns new technologies quickly.
- Strong in SQL, with hands-on coding skills in Python and shell scripts.
- Design and develop ETL pipelines across multiple platforms and tools, including Spark, Hadoop, and AWS Data Services.
- Develop, test, and maintain high-quality software using Python and shell.
- Design and develop schema definitions and support the data warehouse/mart to enable integration of disparate data sources from within Intuit and outside, aggregate the data, and make it available for analysis.
- Support large data volumes and accommodate flexible provisioning of new sources.
- Contribute to the design and architecture of projects across the data landscape.
- Participate in the entire software development lifecycle: building, testing, and delivering high-quality solutions.
- Write clean, reusable code that can be easily maintained and scaled.
- Gather functional requirements, develop technical specifications, and handle project and test planning.
- Work with business users to develop and refine analytical requirements for quantitative data (view-through, clickstream, acquisition, product usage, transactions), qualitative data (survey, market research), and unstructured data (blog, social network).
- Familiarity with on-call support and fixing issues before SLA breach.
- Experience with Agile development, Scrum, or Extreme Programming methodologies.
- Help align to overall strategies and reconcile competing priorities across the organization.

Educational Qualification: Bachelor's degree in Engineering or Master's degree (PG).
Experience: 5 to 10 years

Mandatory Skills (must have, should be strong):
- SQL, ETL, Spark, Hive, data warehouse/data mart design
- Python / Scala / shell scripting (good at scripting)
- AWS / Azure

Good to have:
- Streaming (Kafka, etc.)
- Gen-AI skills

Notice period: Immediate joiners or July joiners.

If interested, please share the below details by mail so we can reach you. Email ID: sranganathan11494@altimetrik.com
- Total years of experience:
- Experience relevant to PySpark / Python:
- Relevant experience in SQL:
- Experience in data warehousing:
- Experience in AWS:
- Current CTC:
- Expected CTC:
- Notice period:
- Company name:
- Contact no.:
- Contact email ID:
- Current location:
- Preferred location:
- Are you willing to work 2 days from the office (Bangalore)?

Thanks, R Sasikala
Posted 2 days ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are a Rakuten Group company, providing global B2B/B2C services for the mobile telco industry and enabling next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are now taking our mobile offering global! To support our ambitions to provide an innovative cloud-native telco platform for our customers, we are looking to recruit and develop top talent in Digital Product Management. Let's build the future of mobile telecommunications together!

Role: Technical Program Manager

You will independently lead cross-organisation programs, influencing roadmap priorities and technical direction across teams. You will work with stakeholders across the organisation and own the communication of all aspects of the program, including surfacing risks and progress towards the goal. You will guide the team towards technical solutions and make trade-off decisions. You will drive program management best practices across the organisation. The role requires working closely with multiple functional teams (including but not limited to Business, Architects, Engineering, and Operations support) to build and maintain program delivery timelines, unblock teams, define and streamline cross-functional dependencies, and increase the efficiency and velocity of project execution.
You will likely spend most days in Agile, Kanban, or other project planning tools and in meetings with relevant stakeholders, making sure projects keep moving forward, delivering a program execution strategy and timeline, and regularly reporting project health to stakeholders throughout a project's life cycle.

Team: RBSS Delivery organization

Skills and Qualifications:
- Up to 15 years of hands-on technical project/program management experience, with at least 10+ years program-managing or working in Scrums.
- Must have a telecom background, with at least 5+ years' exposure to working with telecom operators/ISPs (B2B, B2C customer solutions) in software delivery/integration in the BSS domain.
- Technology stack: managed complex data migration projects involving technologies such as cloud (AWS, GCP, or compatible), microservices, various DB solutions (Oracle, MySQL, Couchbase, Elastic DB, Camunda, etc.), data streaming technologies (such as Kafka), and associated tools.
- Excellent knowledge of project management methodology and software development life cycles, including Agile, with excellent client-facing and internal communication skills.
- Ability to plan, organize, prioritize, and deliver multiple projects simultaneously.
- In-depth knowledge and understanding of telecom BSS business needs, with the ability to establish and maintain a high level of customer trust and confidence; solid organizational skills, including attention to detail and multitasking.
- Good understanding of the challenges associated with the BSS business and of its high-level modules (CRM, Order Management, Revenue Management, and Billing services).
- Excellent verbal, written, and presentation skills to effectively communicate complex technical and business issues (and solutions) to diverse audiences.
- Strong analytical, planning, and organizational skills, with an ability to manage competing demands.
- Always curious about various issues/items.
- Passion to learn continuously in a fast-moving environment.
- Strong working knowledge of Microsoft Office, Confluence, JIRA, etc.
- Good to have: Project Management Professional (PMP) / Certified Scrum Master certification.
- Good to have: knowledge of external solutions integrated with ETL software, billing, and warehouse/supply-chain-related migration projects.

Key Job Responsibilities:
- Manage and streamline program planning by evaluating incoming project demand across multiple channels against available capacity.
- Regularly define and review KPIs; proactively seek out new and improved mechanisms for visibility, ensuring the program stays aligned with organization objectives.
- Develop and maintain Kanban boards and workstream dashboards.
- Work with stakeholders during the entire life cycle of the program: execute project requirements, prepare detailed project plans, identify risks, manage vendors and vendor resources, measure program metrics, and take corrective and preventive actions.
- Ability to adopt Agile best practices (such as estimation techniques) and to define and optimize processes is essential.
- Coordinate with the product management team to plan features and stories into sprints, understand business priorities, and align required stakeholders so the team can deliver the expected outcome.
- Manage technology improvements and other enhancements from conceptualization to delivery: understand their impact and pros/cons, work through the required detail, and collaborate with all stakeholders until successful deployment in production.
- Manage and deliver planned RBSS releases by working with customers. Work with Scrum Masters, plan Scrum capacity, and manage the productivity of the teams.
- Monitor the progress of software developed by Scrum teams and the quality of deliverables.
- Work with engineering and product teams to scope product delivery, define solution strategies, and understand development alternatives; be available to the team to answer questions and provide direction.
- Work across multiple teams and vendors (cross-cutting across programs, business/engineering teams, and/or technologies) to drive delivery strategy and dependency management, ensuring active delivery and proactive communications.
- Forecast and manage infrastructure and resourcing demand against the operational growth of the platform, in collaboration with engineering teams.
- Deliver Agile projects that offer outstanding business value to users.
- Support stakeholders in implementing an effective project governance system.

"Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation, and beliefs."
Posted 2 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: QA Tester - Data
Job Type: Full-time
Location: On-site - Hyderabad, Pune, or New Delhi

Job Summary: Join our customer's team as a dedicated ETL Tester, where your expertise will drive the quality and reliability of crucial business data solutions. As an integral part of our testing group, you will focus on ETL testing while engaging in automation, API, and MDM testing to support robust, end-to-end data validation and integration. We value professionals who demonstrate strong written and verbal communication and a passion for delivering high-quality solutions.

Key Responsibilities:
- Design, develop, and execute comprehensive ETL test cases, scenarios, and scripts to validate data extraction, transformation, and loading processes.
- Collaborate with data engineers, business analysts, and QA peers to clarify requirements and ensure accurate data mapping, lineage, and transformations.
- Perform functional, automation, API, and MDM testing to support a holistic approach to quality assurance.
- Utilize tools such as Selenium to drive automation efforts for repeatable and scalable ETL testing processes.
- Identify, document, and track defects while proactively communicating risks and issues to stakeholders with clarity and detail.
- Work on continuous improvement initiatives to enhance test coverage, efficiency, and effectiveness within the ETL testing framework.
- Create and maintain detailed documentation for test processes and outcomes, supporting both internal knowledge sharing and compliance requirements.

Required Skills and Qualifications:
- Strong hands-on experience in ETL testing, including understanding of ETL tools and processes.
- Proficiency in automation testing using Selenium or similar frameworks.
- Experience in API testing, functional testing, and MDM testing.
- Excellent written and verbal communication skills, with an ability to articulate technical concepts clearly to diverse audiences.
- Solid analytical and problem-solving abilities to troubleshoot data and process issues.
- Attention to detail and a commitment to high-quality deliverables.
- Ability to thrive in a collaborative, fast-paced team environment on-site at Hyderabad.

Preferred Qualifications:
- Prior experience working in large-scale data environments or within MDM projects.
- Familiarity with data warehousing concepts, SQL, and data migration best practices.
- ISTQB or related QA/testing certification.
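At its core, an ETL test case like those described above pins down a transformation rule and asserts the loaded rows match it. A minimal sketch with a hypothetical rule (trim whitespace, upper-case country codes) and made-up rows; real suites would pull `extracted` and `loaded` from source and target databases:

```python
# Hypothetical transformation rule under test: trim whitespace and
# normalise country codes to upper case.
def transform(row):
    name, country = row
    return (name.strip(), country.strip().upper())

def run_etl_checks(extracted, loaded):
    """Return (expected, actual) pairs where the loaded row disagrees
    with what the transformation rule predicts."""
    defects = []
    for src, tgt in zip(extracted, loaded):
        expected = transform(src)
        if expected != tgt:
            defects.append((expected, tgt))
    return defects

extracted = [(" Asha ", "in"), ("Ravi", " us ")]
loaded    = [("Asha", "IN"), ("Ravi", "USA")]   # second row loaded wrongly
print(run_etl_checks(extracted, loaded))  # → [(('Ravi', 'US'), ('Ravi', 'USA'))]
```

Each returned pair is a ready-made defect report: the expected value the mapping should have produced next to what actually landed in the target.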
Posted 2 days ago
5.0 years
0 Lacs
Kochi, Kerala, India
Remote
Job Description
🔹 Position: Senior Data Analyst
📍 Location: Trivandrum/Kochi/Remote
🕓 Experience: 5+ Years
⌛ Notice Period: Immediate Joiners Only
🛠 Mandatory Skills: SQL, Power BI, Python, Amazon Athena

🔎 Job Purpose
We are seeking an experienced and analytical Senior Data Analyst to join our Data & Analytics team. The ideal candidate will have a strong background in data analysis, visualization, and stakeholder communication. You will be responsible for turning data into actionable insights that help shape strategic and operational decisions across the organization.

📍 Duties & Responsibilities
- Collaborate with business stakeholders to understand data needs and translate them into analytical requirements.
- Analyze large datasets to uncover trends, patterns, and actionable insights.
- Design and build dashboards and reports using Power BI.
- Perform ad-hoc analysis and develop data-driven narratives to support decision-making.
- Ensure data accuracy, consistency, and integrity through data validation and quality checks.
- Build and maintain SQL queries, views, and data models for reporting purposes.
- Communicate findings clearly through presentations, visualizations, and written summaries.
- Partner with data engineers and architects to improve data pipelines and architecture.
- Contribute to the definition of KPIs, metrics, and data governance standards.

📍 Skills and Competencies
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
- 5+ years of experience in a data analyst or business intelligence role.
- Advanced proficiency in SQL and experience working with relational databases (e.g., SQL Server, Redshift, Snowflake).
- Hands-on experience in Power BI.
- Proficiency in Python, Excel, and data storytelling.
- Understanding of data modelling, ETL concepts, and basic data architecture.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management skills.
- Adherence to the Information Security Management policies and procedures.

📍 Soft Skills Required
- Must be a good team player with good communication skills.
- Must have good presentation skills.
- Must be a proactive problem solver and a self-driven leader.
- Manage and nurture a team of data engineers.
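The analyst role above stresses data validation and quality checks before insights are published. A small sketch of such a pass using only the standard library; the column name `revenue`, the 3-sigma threshold, and the sample rows are all hypothetical:

```python
import statistics

def quality_report(rows, value_key="revenue", max_sigma=3):
    """Null count and a simple sigma-based outlier flag for one numeric column."""
    values = [r[value_key] for r in rows if r[value_key] is not None]
    nulls = sum(1 for r in rows if r[value_key] is None)
    mean, stdev = statistics.mean(values), statistics.pstdev(values)
    outliers = [v for v in values if stdev and abs(v - mean) > max_sigma * stdev]
    return {"rows": len(rows), "nulls": nulls, "outliers": outliers}

rows = [{"revenue": 100}, {"revenue": 110}, {"revenue": None}, {"revenue": 105}]
print(quality_report(rows))  # → {'rows': 4, 'nulls': 1, 'outliers': []}
```

In practice the same checks would run as SQL against the warehouse (or Athena) and feed a data-quality page in the Power BI report, but the logic is the same.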
Posted 2 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?
🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.
🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.
🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.

Learn more on the Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Lead Software Engineer - Data to Coupa: The Lead Data Engineer plays a critical role in shaping Coupa's data infrastructure, driving the design and implementation of scalable, high-performance data solutions. Collaborating with teams across engineering, data science, and product, this role ensures the integrity, security, and efficiency of our data systems. Beyond technical execution, the Lead Data Engineer provides mentorship and defines best practices, supporting a culture of excellence. Their expertise will directly support Coupa's ability to deliver innovative, data-driven solutions, enabling business growth and reinforcing our leadership in cloud-based spend management.

What You'll Do:
- Lead and drive the development and optimization of scalable data architectures and pipelines.
- Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing.
- Optimize data analysis and computation for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks.
- Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services.
- Ensure data security and governance, enforcing compliance with industry standards and regulations.
- Develop and promote best practices for data modeling, processing, and analytics.
- Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence.
- Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making.
- Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency.
- Develop real-time and batch data processing solutions, integrating structured and unstructured data sources.

What you will bring to Coupa:
- 10+ years of experience in Data Engineering and application development, with at least 3+ years in a technical lead role.
- A graduate degree in Computer Science or a related field of study.
- Experience with programming languages such as Python and Java; expertise in Python is a must.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Expertise in processing and analyzing large data workloads.
- Experience in designing and implementing scalable data warehouse solutions to support analytical and reporting needs.
- Experience with API development and design with REST or GraphQL.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases.
- Experience with data pipeline and workflow management tools.
- Experience with AWS cloud services.

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees. Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
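The requirements above mention message queuing and stream processing. The essential producer/consumer pattern can be sketched with the standard library alone; in production these would be Kafka topics and Spark jobs rather than a `queue.Queue`, and the event fields (`user_id`, `amount_cents`) are made up for illustration:

```python
import queue
import threading

events = queue.Queue()
loaded = []

def producer():
    # Push raw events onto the queue, then a sentinel meaning "no more events".
    for i in range(5):
        events.put({"user_id": i, "amount_cents": i * 250})
    events.put(None)

def consumer():
    # Pull events, apply a transformation, and "load" them downstream.
    while True:
        evt = events.get()
        if evt is None:
            break
        loaded.append({"user_id": evt["user_id"], "amount": evt["amount_cents"] / 100})

t1, t2 = threading.Thread(target=producer), threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
print(loaded[-1])  # → {'user_id': 4, 'amount': 10.0}
```

Decoupling the producer from the consumer through the queue is what lets real pipelines scale each side independently and absorb bursts, which is the property the listing is probing for.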
Posted 2 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Infosys BPM Ltd.! We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 18th & 19th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the below link to apply and register your application. Please mention your Candidate ID on top of the resume. ***

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details:
- Interview date: 18th & 19th June 2025
- Interview time: 10 AM till 1 PM
- Interview venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find below the job descriptions for your reference:
- Work from office ***
- Rotational shifts
- A minimum of 2 years of project experience is mandatory ***

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional testing and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the infra domain.
- A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience; will also mentor the team in technical writing.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions. Familiarity with defect tracking tools (e.g., Jira). Experience with cloud platforms like AWS, Azure, or GCP is a plus. Experience with Python or other scripting languages for test automation is a plus. Experience with data quality tools is a plus. Experience in testing of large datasets. Experience in agile development is a must. Understanding of Oracle Database and UNIX/VMC systems is a must. Job Description: ETL Testing Automation Strong experience in ETL testing and automation. Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server). Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark). Hands-on experience in developing and maintaining test automation frameworks. Proficiency in at least one programming language (e.g., Python, Java). Experience with test automation tools (e.g., Selenium, PyTest, JUnit). Strong understanding of data warehousing concepts and methodologies. Experience with CI/CD pipelines and version control systems (e.g., Git). Experience with cloud-based data warehouses like Snowflake, Redshift, BigQuery is a plus. Experience with data quality tools is a plus. Job Description: .NET Should have worked on a .NET development/implementation/support project. Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure. Must have experience in Web services, Web API, REST services, HTML, CSS3. Understand Architecture Requirements and ensure effective Design, Development, Validation and Support activities. REGISTRATION PROCESS: The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the below instructions to successfully complete the registration. (Talents without registration & assessment will not be allowed for the interview).
Candidate ID Registration process: STEP 1: Visit: https://career.infosys.com/joblist STEP 2: Click on "Register" and provide the required details and submit. STEP 3: Once submitted, Your Candidate ID(100XXXXXXXX) will be generated. STEP 4: The candidate ID will be shared to the registered Email ID. SHL Test(AMCAT ID) Registration process: This assessment is proctored, and talent gets evaluated on Basic analytics, English Comprehension and writex (email writing). STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0 STEP 2: Click on "Start new test" and follow the instructions to complete the assessment. STEP 3: Once completed, please make a note of the AMCAT ID( Access you Amcat id by clicking 3 dots on top right corner of screen). NOTE: During registration, you'll be asked to provide the following information: Personal Details: Name, Email Address, Mobile Number, PAN number. 
Availability: Acknowledgement of work schedule preferences (Shifts, Work from Office, Rotational Weekends, 24/7 availability, Transport Boundary) and reason for career change. Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example). Candidate Information: 10-digit candidate ID starting with 100XXXXXXX, Gender, Source (e.g., Vendor name, Naukri/LinkedIn/Found it, or Direct), and Location. Interview Mode: Walk-in. Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. 5 or more toggles, multiple faces detected, face not detected, or any other malpractice will be considered grounds for rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment. Documents to Carry: Please have a note of your Candidate ID & AMCAT ID along with your registered Email ID. Please do not carry laptops/cameras to the venue as these will not be allowed due to security restrictions. Please carry 2 sets of your updated Resume/CV (Hard Copy). Please carry original ID proof for security clearance. Please carry individual headphones/Bluetooth for the interview. Pointers to note: An original Government ID card is a must for Security Clearance. Regards, Infosys BPM Recruitment team.
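The source-to-target checks central to the ETL DB Testing role above can be sketched with stdlib `sqlite3` standing in for the real source and target databases; the table and column names here are invented for illustration:

```python
import sqlite3

# In-memory stand-ins for the actual source and target databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def counts_match(conn, src, tgt):
    """Row-count reconciliation: the cheapest smoke test after a load."""
    (n_src,), = conn.execute(f"SELECT COUNT(*) FROM {src}")
    (n_tgt,), = conn.execute(f"SELECT COUNT(*) FROM {tgt}")
    return n_src == n_tgt

def missing_rows(conn, src, tgt):
    """Rows present in the source but absent from the target."""
    sql = f"SELECT id, amount FROM {src} EXCEPT SELECT id, amount FROM {tgt}"
    return conn.execute(sql).fetchall()

ok = counts_match(conn, "src_orders", "tgt_orders")
gaps = missing_rows(conn, "src_orders", "tgt_orders")
```

In practice the same queries would run over Oracle or a cloud warehouse via the appropriate driver, and the asserts would live in a PyTest suite.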
Posted 2 days ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Role Title: Data Scientist Location: India Worker Type: Full-Time Employee (FTE) Years of Experience: 8+ years Start Date: Within 2 weeks Engagement Type: Full-time Salary Range: Flexible Remote/Onsite: Hybrid (India-based candidates) Job Overview: We are looking for an experienced Data Scientist to join our team and contribute to developing AI-driven data conversion tools. You will work closely with engineers and business stakeholders to build intelligent systems for data mapping, validation, and transformation. Required Skills and Experience: • Bachelor’s or Master’s in Data Science, Computer Science, AI, or a related field • Strong programming skills in Python and SQL • Experience with ML frameworks like TensorFlow or PyTorch • Solid understanding of AI-based data mapping, code generation, and validation • Familiarity with databases like SQL Server and MongoDB • Excellent collaboration, problem-solving, and communication skills • At least 8 years of relevant experience in Data Science • Open mindset with a willingness to experiment and learn from failures Preferred Qualifications: • Experience in the financial services domain • Certifications in Data Science or AI/ML • Background in data wrangling, ETL, or master data management • Exposure to DevOps tools like Jira, Confluence, BitBucket • Knowledge of cloud and AI/ML tools like Azure Synapse, Azure ML, Cognitive Services, and Databricks • Prior experience delivering AI solutions for data conversion or transformation
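The "AI-based data mapping" mentioned above often starts from something far simpler: proposing source-to-target column matches that a model or a human then validates. A stdlib sketch using string similarity; the column names are invented for illustration:

```python
from difflib import get_close_matches

def suggest_mapping(source_cols, target_cols, cutoff=0.6):
    """Propose a target column for each source column by string similarity."""
    mapping = {}
    for col in source_cols:
        match = get_close_matches(col, target_cols, n=1, cutoff=cutoff)
        mapping[col] = match[0] if match else None  # None = needs human review
    return mapping

src = ["cust_name", "acct_num", "open_dt"]
tgt = ["customer_name", "account_number", "open_date", "branch_id"]
mapping = suggest_mapping(src, tgt)
```

A production system would layer embeddings or a trained classifier on top, but the suggest-then-validate loop is the same.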
Posted 2 days ago
5.0 - 6.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Overview: We are seeking an experienced and organized Team Lead to join our logistics operations on third-party payroll. The ideal candidate will be responsible for overseeing the daily operations of the logistics team, ensuring efficient order fulfillment, supply chain coordination, and team management. Key Responsibilities: 1. Team Management & Coordination: - Support customer service and delivery activities by coordinating and directing teams handling shipping, receiving, and storage of goods. - Focus on order fulfillment and supply chain coordination while managing team performance. - Ensure high levels of team productivity and adherence to company policies, including safety standards. 2. Operational Efficiency: - Develop strategies to maximize assets in logistics and inventory planning. - Resolve issues impacting operational progress and ensure smooth workflow. - Oversee daily operations, ensuring the team meets set KPIs and brand standards. 3. Scheduling & Payroll Management: - Accomplish resource allocation, scheduling, and coordination of staff to meet operational demands. - Ensure the accuracy of payroll for team members in alignment with working hours and attendance. 4. Technology & Process Management: - Develop custom RF transactions and conversion programs to optimize logistics operations. - Use RF scanners to pull products from stockroom and receiving areas to ensure smooth inventory flow. 5. Team Development: - Promote and mentor team leaders for career growth into higher roles. - Develop interns into leadership positions such as ETL (Extended Team Leaders). 6. Stock Management & Compliance: - Perform daily in-stocks using the PDA system to maintain accurate product counts. - Ensure that all logistics operations comply with company policies, corporate standards, and safety regulations. 7. Leadership & Safety: - Serve as the district assessor for Hardlines to ensure all stores meet corporate standards.
- Act as a role model in promoting safety and productivity on the floor. Skills & Qualifications: * Proven experience in team handling within the logistics industry (5-6 years). * Strong organizational, time management, and problem-solving skills. * Proficiency with RF scanners and PDA systems. * Ability to manage multiple tasks and coordinate effectively under pressure. * Excellent interpersonal and leadership skills with a focus on team development.
Posted 2 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At o9 Solutions, our mission is clear: be the Most Valuable Platform (MVP) for enterprises. With our AI-driven platform — the o9 Digital Brain — we integrate global enterprises’ siloed planning capabilities, helping them capture millions and, in some cases, billions of dollars in value leakage. But our impact doesn’t stop there. Businesses that plan better and faster also reduce waste, which drives better outcomes for the planet, too. We're on the lookout for the brightest, most committed individuals to join us on our mission. Along the journey, we’ll provide you with a nurturing environment where you can be part of something truly extraordinary and make a real difference for companies and the planet. What you’ll do for us: Apply a variety of machine learning techniques (clustering, regression, ensemble learning, neural nets, time series, optimizations etc.), weighing their real-world advantages/drawbacks Develop and/or optimize models for demand sensing/forecasting, optimization (Heuristic, LP, GA etc), Anomaly detection, Simulation and stochastic models, Market Intelligence etc. Use latest advancements in AI/ML to solve business problems Analyze problems by synthesizing complex information, evaluating alternate methods, and articulating the result with the relevant assumptions/reasons Application of common business metrics (Forecast Accuracy, Bias, MAPE) and the ability to generate new ones as needed. Develop or optimize modules to call web services for real-time integration with external systems Work collaboratively with Clients, Project Management, Solution Architects, Consultants and Data Engineers to ensure successful delivery of o9 projects What you’ll have: Experience: 4+ Years of experience in time series forecasting at scale using heuristic-based hierarchical best-fit models using algorithms like exponential smoothing, ARIMA, prophet and custom parameter tuning.
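The heuristic best-fit forecasting described above can be illustrated with simple exponential smoothing, grid-searching the smoothing weight that minimises one-step-ahead error on history. Pure stdlib; the demand series is made up:

```python
def ses_fit(series, alpha):
    """One-step-ahead simple exponential smoothing; returns (final level, SSE)."""
    level, sse = series[0], 0.0
    for y in series[1:]:
        sse += (y - level) ** 2              # error of forecasting y with the old level
        level = alpha * y + (1 - alpha) * level
    return level, sse

def best_fit(series, grid=(0.1, 0.3, 0.5, 0.7, 0.9)):
    """Grid-search alpha; keep the one with the smallest in-sample SSE."""
    scored = [(ses_fit(series, a)[1], a) for a in grid]
    _, alpha = min(scored)
    return alpha, ses_fit(series, alpha)[0]  # final level = next-period forecast

demand = [100, 102, 101, 105, 107, 106, 110]
alpha, forecast = best_fit(demand)
```

The same select-by-error pattern generalises to a candidate pool of ARIMA, Prophet, and other models, which is what "hierarchical best-fit" implies at scale.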
Experience in applied analytical methods in the field of Supply chain and planning, like demand planning, supply planning, market intelligence, optimal assortments/pricing/inventory etc. Should be from a statistical background. Education: Bachelor's Degree in Computer Science, Mathematics, Statistics, Economics, Engineering or related field Languages: Python and/or R for Data Science Skills: Deep knowledge of statistical and machine learning algorithms, building scalable ML frameworks, identifying and collecting relevant input data, feature engineering, tuning, and testing. Characteristics: Independent thinkers Strong presentation and communication skills We really value team spirit: Transparency and frequent communication are key. At o9, this is not limited by hierarchy, distance, or function. Nice to have: Experience with SQL, databases and ETL tools or similar is optional but preferred Exposure to distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, or related Big Data technologies Experience with Deep Learning frameworks such as Keras, Tensorflow or PyTorch is preferable Experience in implementing planning applications will be a plus Understanding of Supply Chain Concepts will be preferable Master's Degree in Computer Science, Applied Mathematics, Statistics, Engineering, Business Analytics, Operations, or related field What we’ll do for you Competitive salary with stock options to eligible candidates Flat organization: With a very strong entrepreneurial culture (and no corporate politics) Great people and unlimited fun at work Possibility to make a difference in a scale-up environment. Opportunity to travel onsite in specific phases depending on project requirements. Support network: Work with a team you can learn from every day. Diversity: We pride ourselves on our international working environment.
Work-Life Balance: https://youtu.be/IHSZeUPATBA?feature=shared Feel part of a team: https://youtu.be/QbjtgaCyhes?feature=shared How the process works Apply by clicking the button below. You’ll be contacted by our recruiter, who’ll fill you in on all things at o9, give you some background about the role and get to know you. They’ll contact you either via video call or phone call - whatever you prefer. During the interview phase, you will meet with technical panels for 60 minutes. The recruiter will contact you after the interview to let you know if we’d like to progress your application. We will have 2 rounds of Technical discussion followed by a Hiring Manager discussion. Our recruiter will let you know if you’re the successful candidate. Good luck! More about us … With the latest increase in our valuation from $2.7B to $3.7B despite challenging global macroeconomic conditions, o9 Solutions is one of the fastest-growing technology companies in the world today. Our mission is to digitally transform planning and decision-making for the enterprise and the planet. Our culture is high-energy and drives us to aim 10x in everything we do. Our platform, the o9 Digital Brain, is the premier AI-powered, cloud-native platform driving the digital transformations of major global enterprises including Google, Walmart, ABInBev, Starbucks and many others. Our headquarters are located in Dallas, with offices in Amsterdam, Paris, London, Barcelona, Madrid, Sao Paulo, Bengaluru, Tokyo, Seoul, Milan, Stockholm, Sydney, Shanghai, Singapore and Munich. o9 is an equal opportunity employer and seeks applicants of diverse backgrounds and hires without regard to race, colour, gender, religion, national origin, citizenship, age, sexual orientation or any other characteristic protected by law.
Posted 2 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Proficiency in Talend ETL development and integration with Snowflake. Hands-on experience with IBM Data Replicator and Qlik Replicate. Strong knowledge of Snowflake database architecture and Type 2 SCD modeling. Expertise in containerized DB2, DB2, Oracle, and Hadoop data sources. Understanding of Change Data Capture (CDC) processes and real-time data replication patterns. Experience with SQL, Python, or Shell scripting for data transformations and automation. Tools/Skills: Talend, IBM Data Replicator, Qlik Replicate, SQL, Python Skills: ibm,sql,ibm data replicator,snowflake database architecture,etl development,db2,type 2 scd modeling,shell scripting,talend,snowflake,python,change data capture (cdc),hadoop,qlik replicate,data,oracle
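The Type 2 SCD modeling named above keeps full history by end-dating the current row and inserting a new version whenever a tracked attribute changes, typically driven by a CDC feed. A minimal in-memory sketch; the record layout is hypothetical:

```python
from datetime import date

def apply_scd2(dim, change, today):
    """Close the current version of the changed key, then insert the new version."""
    for row in dim:
        if row["id"] == change["id"] and row["end_date"] is None:
            row["end_date"] = today          # end-date the previously current row
    dim.append({**change, "start_date": today, "end_date": None})

# One current dimension row; a CDC change arrives for the same key.
dim = [{"id": 1, "city": "Pune", "start_date": date(2024, 1, 1), "end_date": None}]
apply_scd2(dim, {"id": 1, "city": "Chennai"}, date(2025, 6, 17))
```

In a warehouse the same close-and-insert would be expressed as a `MERGE` (or matched `UPDATE` plus `INSERT`) against the dimension table.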
Posted 2 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience Level: 5–8 years in testing SAS applications and data pipelines. Proficiency in SAS programming (Base SAS, Macro, SQL) and SQL query validation. Experience with data testing frameworks and tools for data validation and reconciliation. Knowledge of Snowflake and explicit pass-through SQL for data integration testing. Familiarity with Talend, IBM Data Replicator, and Qlik Replicate for ETL pipeline validation. Hands-on experience with test automation tools (e.g., Selenium, Python, or Shell scripts). Skills: data validation,sql query validation,shell scripts,macro,sql,ibm data replicator,etl pipeline validation,pass-through sql,sas programming,base sas,sas,data testing frameworks,talend,snowflake,selenium,python,testing,data reconciliation,qlik replicate,data,test automation tools
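The data validation and reconciliation work described above often reduces to comparing per-key aggregates between the legacy output (e.g. SAS) and the migrated output (e.g. Snowflake). A stdlib sketch with invented data:

```python
from collections import defaultdict

def aggregate(rows):
    """Sum amounts per key: the kind of figure both systems must agree on."""
    totals = defaultdict(float)
    for key, amount in rows:
        totals[key] += amount
    return dict(totals)

def reconcile(legacy_rows, migrated_rows, tol=1e-9):
    """Return the keys whose totals differ beyond a tolerance."""
    a, b = aggregate(legacy_rows), aggregate(migrated_rows)
    keys = set(a) | set(b)
    return sorted(k for k in keys if abs(a.get(k, 0.0) - b.get(k, 0.0)) > tol)

legacy = [("A", 10.0), ("A", 5.0), ("B", 3.0)]
migrated = [("A", 15.0), ("B", 2.5)]
diffs = reconcile(legacy, migrated)
```

In a real pipeline the two inputs would come from a SAS dataset export and a pass-through SQL query against the warehouse, with the mismatched keys fed into defect tracking.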
Posted 2 days ago
0.0 - 3.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job ID R-225846 Date posted 06/17/2025 Job Title: Senior Manager - Product Quality Engineering Leader Career Level - E Introduction to role: Join our Commercial IT Data Analytics & AI (DAAI) team as a Product Quality Leader, where you will play a pivotal role in ensuring the quality and stability of our data platforms built on AWS services, Databricks, and Snaplogic. Based in Chennai GITC, you will drive the quality engineering strategy, lead a team of quality engineers, and contribute to the overall success of our data platform. Accountabilities : As the Product Quality Team Leader for data platforms, your key accountabilities will include leadership and mentorship, quality engineering standards, collaboration, technical expertise, and innovation and process improvement. You will lead the design, development, and maintenance of scalable and secure data infrastructure and tools to support the data analytics and data science teams. You will also develop and implement data and data engineering quality assurance strategies and plans tailored to data product build and operations. Essential Skills/Experience: Bachelor’s degree or equivalent in Computer Engineering, Computer Science, or a related field Proven experience in a product quality engineering or similar role, with at least 3 years of experience in managing and leading a team. Experience of working within a quality and compliance environment and application of policies, procedures, and guidelines A broad understanding of cloud architecture (preferably in AWS) Strong experience in Databricks, Pyspark and the AWS suite of applications (like S3, Redshift, Lambda, Glue, EMR). Proficiency in programming languages such as Python Experienced in Agile Development techniques and Methodologies. Solid understanding of data modelling, ETL processes and data warehousing concepts Excellent communication and leadership skills, with the ability to collaborate effectively with the technical and non-technical stakeholders. 
Experience with big data technologies such as Hadoop or Spark Certification in AWS or Databricks. Prior significant experience working in Pharmaceutical or Healthcare industry IT environment. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we are committed to disrupting an industry and changing lives. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak and lead a new way of working, combining cutting-edge science with leading digital technology platforms and data. We dare to lead, applying our problem-solving mindset to identify and tackle opportunities across the whole enterprise. Our spirit of experimentation is lived every day through our events like hackathons. We enable AstraZeneca to perform at its peak by delivering world-class technology and data solutions. Are you ready to be part of a team that has the backing to innovate, disrupt an industry and change lives? Apply now to join us on this exciting journey! AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. 
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements. Why choose AstraZeneca India? Help push the boundaries of science to deliver life-changing medicines to patients. After 45 years in India, we’re continuing to secure a future where everyone can access affordable, sustainable, innovative healthcare. The part you play in our business will be challenging, yet rewarding, requiring you to use your resilient, collaborative and diplomatic skillsets to make connections. The majority of your work will be field based, and will require you to be highly-organised, planning your monthly schedule, attending meetings and calls, as well as writing up reports. Who do we look for? Calling all tech innovators, ownership takers, challenge seekers and proactive collaborators. At AstraZeneca, breakthroughs born in the lab become transformative medicine for the world's most complex diseases. We empower people like you to push the boundaries of science, challenge convention, and unleash your entrepreneurial spirit. You'll embrace differences and take bold actions to drive the change needed to meet global healthcare and sustainability challenges. Here, diverse minds and bold disruptors can meaningfully impact the future of healthcare using cutting-edge technology. Whether you join us in Bengaluru or Chennai, you can make a tangible impact within a global biopharmaceutical company that invests in your future. Join a talented global team that's powering AstraZeneca to better serve patients every day. Success Profile Ready to make an impact in your career? If you're passionate, growth-orientated and a true team player, we'll help you succeed. Here are some of the skills and capabilities we look for.
Tech innovators Make a greater impact through our digitally enabled enterprise. Use your skills in data and technology to transform and optimise our operations, helping us deliver meaningful work that changes lives. Ownership takers If you're a self-aware self-starter who craves autonomy, AstraZeneca provides the perfect environment to take ownership and grow. Here, you'll feel empowered to lead and reach excellence at every level — with unrivalled support when you need it. Challenge seekers Adapting and advancing our progress means constantly challenging the status quo. In this dynamic environment where everything we do has urgency and focus, you'll have the ability to show up, speak up and confidently take smart risks. Proactive collaborators Your unique perspectives make our ambitions and capabilities possible. Our culture of sharing ideas, learning and improving together helps us consistently set the bar higher. As a proactive collaborator, you'll seek out ways to bring people together to achieve their best.
What we offer We're driven by our shared values of serving people, society and the planet. Our people make this possible, which is why we prioritise diversity, safety, empowerment and collaboration. Discover what a career at AstraZeneca could mean for you. Lifelong learning Our development opportunities are second to none.
You'll have the chance to grow your abilities, skills and knowledge constantly as you accelerate your career. From leadership projects and constructive coaching to overseas talent exchanges and global collaboration programmes, you'll never stand still. Autonomy and reward Experience the power of shaping your career how you want to. We are a high-performing learning organisation with autonomy over how we learn. Make big decisions, learn from your mistakes and continue growing — with performance-based rewards as part of the package. Health and wellbeing An energised work environment is only possible when our people have a healthy work-life balance and are supported for their individual needs. That's why we have a dedicated team to ensure your physical, financial and psychological wellbeing is a top priority. Inclusion and diversity Diversity and inclusion are embedded in everything we do. We're at our best and most creative when drawing on our different views, experiences and strengths. That's why we're committed to creating a workplace where everyone can thrive in a culture of respect, collaboration and innovation.
Posted 2 days ago
10.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Information Date Opened 06/17/2025 Job Type Full time Industry IT Services City Hyderabad State/Province Telangana Country India Zip/Postal Code 500081 About Us About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin with offices in Dublin, OH, Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership. Job Description Job Title: Technical Project Manager Location: Hyderabad Employment Type: Full-time Experience: 10+ years Domain: Banking and Insurance We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration. Responsibilities: Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication. Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects. Translate business requirements into technical specifications and work plans. Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies. Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows. Oversee code reviews and ensure adherence to data engineering best practices. Provide hands-on support when necessary, in Python-based development or debugging. Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA. Track project metrics and prepare progress reports for stakeholders.
Requirements Required Qualifications: Bachelor’s or master’s degree in computer science, Information Systems, Engineering, or related field. 10+ years of experience in project management or technical leadership roles. Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming). Experience working with cloud platforms like AWS, GCP, or Azure. Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines. Strong communication and stakeholder management skills. Benefits Company standard benefits.
Posted 2 days ago
4.0 years
0 Lacs
Hyderabad, Telangana
On-site
General information: Country India; State Telangana; City Hyderabad; Job ID 44957; Department Development

Description & Requirements
Infor is a leading ERP solution provider committed to delivering seamless data integration experiences for our customers. Our Data Platform team is growing, and it focuses on building innovative solutions that connect our ecosystem with industry-standard BI tools, ETL platforms, and databases, ensuring data quality, reliability, and scalability. We are looking for passionate software engineers with expertise in C++ and connector development to join our team. If you enjoy building robust solutions and enabling data-driven decision-making, we want to hear from you!

Key Responsibilities
- Design, develop, and maintain third-party data connectors, including ODBC, JDBC, and native integrations with platforms like Power BI, Tableau, and other BI/ETL tools.
- Implement high-performance, secure, and scalable data interchange protocols.
- Collaborate with cross-functional teams to define technical requirements and integration standards.
- Debug, troubleshoot, and optimize existing connectors for performance and reliability.
- Work closely with the Quality Assurance team to ensure connectors meet high-quality standards.
- Stay updated on industry trends, standards, and technologies relevant to data integration.

Required Skills and Qualifications
- 4+ years of experience. Proficiency in C++ with a strong understanding of object-oriented programming and system-level development.
- Hands-on experience with ODBC and JDBC driver development.
- Familiarity with data protocols such as REST, SOAP, and GraphQL.
- Experience integrating with BI platforms like Power BI, Tableau, or Looker.
- Solid understanding of database systems (SQL and NoSQL) and data interchange formats (JSON, XML).
- Knowledge of ETL tools and workflows.
- Excellent problem-solving skills and attention to detail.

Preferred Qualifications
- Experience with other programming languages (e.g., Python, Java, or C#).
- Knowledge of cloud platforms (AWS, Azure, GCP) and containerization technologies like Docker and Kubernetes.
- Familiarity with CI/CD pipelines for development and deployment.
- Exposure to Agile development practices.

About Infor
Infor is a global leader in business cloud software products for companies in industry-specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information, visit www.infor.com.

Our Values
At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and the communities we serve, now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees.

Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law.
If you require accommodation or assistance at any time during the application or selection processes, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description: This position requires frequent collaboration with developers, architects, data product owners, and source system teams. The ideal candidate is a versatile professional with deep expertise spanning data engineering, software architecture, data analysis, visualization, BI tools, relational databases, and data warehouse architecture across traditional and cloud environments. Experience with emerging AI technologies, including Generative AI, is highly valued.

Key Roles and Responsibilities
- Lead the end-to-end design, architecture, development, testing, and deployment of scalable Data & AI solutions across traditional data warehouses, data lakes, and cloud platforms such as Snowflake, Azure, AWS, Databricks, and Delta Lake.
- Architect and build secure, scalable software systems, microservices, and APIs leveraging best practices in software engineering, automation, version control, and CI/CD pipelines.
- Develop, optimize, and maintain complex SQL queries, Python scripts, Unix/Linux shell scripts, and AI/ML pipelines to transform, analyze, and operationalize data and AI models.
- Incorporate GenAI technologies by evaluating, deploying, fine-tuning, and integrating models to enhance data products and business insights.
- Translate business requirements into robust data products, including interactive dashboards and reports using Power BI, Tableau, or equivalent BI tools.
- Implement rigorous testing strategies to ensure reliability, performance, and security throughout the software development lifecycle.
- Lead and mentor engineering teams, fostering collaboration, knowledge sharing, and upskilling in evolving technologies including GenAI.
- Evaluate and select optimal technologies for platform scalability, performance monitoring, and cost optimization in both cloud and on-premise environments.
- Partner cross-functionally with development, operations, AI research, and business teams to ensure seamless delivery, support, and alignment to organizational goals.
Key Competencies
- Extensive leadership and strategic experience in the full software development lifecycle and enterprise-scale data engineering projects.
- Deep expertise in relational databases, data marts, data warehouses, and advanced SQL programming.
- Strong hands-on experience with ETL processes, Python, Unix/Linux shell scripting, data modeling, and AI/ML pipeline integration.
- Proficiency with Unix/Linux operating systems and scripting environments.
- Advanced knowledge of cloud data platforms (Azure, AWS, Snowflake, Databricks, Delta Lake).
- Solid understanding and practical experience with traditional and Gen AI technologies, including model development, deployment, and integration.
- Familiarity with big data frameworks and streaming technologies such as Hadoop, Spark, and Kafka.
- Experience with containerization and orchestration tools including Docker and Kubernetes.
- Strong grasp of data governance, metadata management, and data security best practices.
- Excellent analytical, problem-solving, and communication skills to articulate complex technical concepts and business impact.
- Ability to independently lead initiatives while fostering a collaborative, innovative team culture.
- Desired: knowledge of software engineering best practices and architectural design patterns.
Required/Desired Skills
- RDBMS and Data Warehousing — 12+ years (Required)
- SQL Programming and ETL — 12+ years (Required)
- Unix/Linux Shell Scripting — 8+ years (Required)
- Python or other programming languages — 6+ years (Required)
- Cloud Platforms (Azure, AWS, Snowflake, Databricks, Delta Lake) — 5+ years (Required)
- Power BI / Tableau — 5 years (Desired)
- Generative AI (model development, deployment, integration) — 3+ years (Desired)
- Big Data Technologies (Hadoop, Spark, Kafka) — 3+ years (Desired)
- Containerization and Orchestration (Docker, Kubernetes) — 2+ years (Desired)
- Data Governance and Security — 3+ years (Desired)
- Software Engineering and Architecture — 4+ years (Desired)

Education & Experience
Bachelor's degree (BS/BA) in Computer Science, Scientific Computing, or a related field is desired. Relevant certifications in data engineering, cloud platforms, or AI technologies may be required or preferred. 13+ years of related experience is the minimum; however, the ideal candidate will have extensive experience as outlined above.

#DataEngineering
Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
0.0 - 7.0 years
0 Lacs
Karnataka
Remote
Role: Senior Analyst
Experience: 2 to 7 years
Location: Remote, Bengaluru

Job Description: We are looking for a Data Analytics and Visualization Engineer who can help our team in the area of Business Intelligence. The candidate will deliver solutions that drive desired business outcomes using KPIs, scorecards, metrics, and dashboards. The ideal candidate has a strong software background, thrives in an entrepreneurial atmosphere, communicates effectively, and can architect, design, and develop enterprise-level dashboards and data warehouse solutions.

Job Responsibilities:
- Work closely with stakeholders to understand requirements and apply analytics and visualization to achieve business objectives.
- Engineer end-to-end solutions including data acquisition, data modeling, table creation, and building dashboards and reports.
- Design and develop Tableau reports and dashboards that yield actionable insights and answer business questions.
- Code and modify SQL/ETL based on dashboard requirements.
- Analyze unstructured/semi-structured data and derive insights.
- Run ad-hoc analysis for Product and Business Managers using standard query languages and operationalize it for repeatable use via the Tableau reporting suite.
- Ensure designs support corporate IT strategy, established technical standards, and industry best practices.
- Provide technical guidance to the project and be responsible for developing and presenting design artifacts to business and development teams.
- Identify project issues/risks and present alternatives to alleviate or resolve them.

Core Competencies
- Strong quantitative and analytical skills: the ability to quickly analyze data to identify key insights and apply them to the business.
- Strong visualization design and development experience with Tableau (and other Business Intelligence tools like Power BI).
- Experience leading analysis, architecture, design, and development of business intelligence solutions using next-generation data platforms.
- Experience developing test strategies for data-centric applications, working in Agile methodologies, and diagnosing complex technical issues.
- Strong understanding of architectural standards and software development methodologies; expertise in industry best practices in data architecture and design.
- Excellent communication skills, including the ability to present effectively to both business and technical audiences at all levels of the organization.

Who You Are
Qualifications:
- Bachelor's degree in Engineering, Computer Science, or a related field.
- 6+ years of experience using business intelligence reporting tools and developing data visualizations, with mastery of Tableau for the creation and automation of enterprise-scale dashboards.
- 6+ years of experience writing advanced SQL, performance tuning BI queries, data modeling, and data mining from multiple sources (SQL, ETL, data warehousing).
- Experience performance tuning Tableau Server dashboards to minimize rendering time.
- Experience with data preparation/blending and ETL tools such as Alteryx, Talend, or Tableau Prep.
- Good knowledge of Tableau metadata tables and PostgreSQL server reporting.
- Experience in a programming language such as Python or R would be a plus.
- Exposure to Snowflake or any cloud data warehouse architecture would be an added advantage.
- A strong foundation in data analytics, with an understanding of and exposure to data science.
- Strong knowledge of data visualization and data warehouse best practices.
- Certification in Tableau, Alteryx, or Snowflake would be a plus.

Job Snapshot: Updated Date 17-06-2025; Job ID J_3750; Location Remote, Karnataka, India; Experience 3 - 7 Years; Employee Type Permanent
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
We are committed to simplifying HR processes through digital transformation and simplification. We believe in harnessing technology to enhance the employee experience and drive organizational success. As an HR Digitization and Simplification Specialist, you will play a pivotal role in shaping our digital HR landscape and streamlining operations for maximum efficiency and effectiveness.

Key Responsibilities:
- Digital HR Strategy Development and Implementation: Collaborate with cross-functional teams to develop and execute a comprehensive HR digitization and simplification strategy aligned with organizational goals. Identify opportunities to leverage technology for process optimization, automation, and enhanced data analytics.
- HR Systems Evaluation and Integration: Conduct thorough assessments of existing HR systems, tools, and platforms. Lead efforts to integrate and optimize HRIS, ATS, LMS, and other relevant software solutions. Ensure seamless data flow between systems to support unified HR operations.
- Process Streamlining and Standardization: Analyze current HR processes and identify areas for simplification and standardization. Develop and implement standardized workflows, ensuring consistency across the organization. Continuously monitor and refine processes to drive operational efficiency.
- Change Management and Training: Act as a change agent to promote a digital mindset within the HR team and across the organization. Develop and deliver training programs to upskill HR staff on new tools, systems, and processes.
- Compliance and Security: Ensure HR digitization efforts comply with relevant data protection laws and regulations. Implement security measures to safeguard sensitive HR information.
- Stakeholder Engagement and Communication: Collaborate with HR leadership to effectively communicate the benefits and progress of digitization initiatives to stakeholders. Foster a culture of transparency and open communication regarding HR digitization efforts.
Qualifications
- Degree, preferably in HR, Business, Engineering, or other analytical and/or technology-related fields, with high academic achievement required; advanced degree preferred.
- Preferred: proficiency in HR technology platforms such as Oracle HCM Cloud, Workday, Taleo, iCIMS, ServiceNow, ...; advanced knowledge of automation tools like Power Automate, Python, R, and other programming languages or tools necessary to implement digitization/automation and simplification.
- Problem-solving, communication, and interpersonal ability to anticipate, identify, and solve critical problems.
- Keen attention to detail and rigorous data management practices.
- Experience with large, complex data sets.
- Knowledge of ETL (Access/Excel templates and Tableau Prep), databases (MS Access), and reporting & analytics (Tableau) will be a plus.
- Management and business development of existing and new solutions.
- Must maintain confidentiality of highly sensitive information.

Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Aug 31, 2025, 10:59:00 AM
0.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Requisition ID: R-10363280
Date posted: 06/17/2025
End Date: 06/30/2025
City: Pune
State/Region: Maharashtra
Country: India
Additional Locations: Noida, Uttar Pradesh
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Tech Lead, Data Architecture

What does a great Data Architect do at Fiserv? We are seeking a seasoned Data Architect with extensive experience in data modeling and architecting data solutions, particularly with Snowflake. The ideal candidate will have 8-12 years of hands-on experience in designing, implementing, and optimizing data architectures to meet the evolving needs of our organization. As a Data Architect, you will play a pivotal role in ensuring the robustness, scalability, and efficiency of our data systems.

What you will do:
- Data Architecture Design: Develop, optimize, and oversee conceptual and logical data systems, ensuring they meet both current and future business requirements.
- Data Modeling: Create and maintain data models using Snowflake, ensuring data integrity, performance, and security.
- Solution Architecture: Design and implement end-to-end data solutions, including data ingestion, transformation, storage, and access.
- Stakeholder Collaboration: Work closely with business stakeholders, data scientists, and engineers to understand data requirements and translate them into technical specifications.
- Performance Optimization: Monitor and improve data system performance, addressing any issues related to scalability, efficiency, and data quality.
- Governance and Compliance: Ensure data architectures comply with data governance policies, standards, and industry regulations.
- Technology Evaluation: Stay current with emerging data technologies and assess their potential impact and value to the organization.
- Mentorship and Leadership: Provide technical guidance and mentorship to junior data architects and engineers, fostering a culture of continuous learning and improvement.

What you will need to have:
- 8-12 years of experience in data architecture and data modeling in Snowflake.
- Proficiency in the Snowflake data warehousing platform.
- Strong understanding of data modeling concepts, including normalization, denormalization, star schema, and snowflake schema.
- Experience with ETL/ELT processes and tools.
- Familiarity with data governance and data security best practices.
- Knowledge of SQL and performance tuning for large-scale data systems.
- Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to translate technical concepts for non-technical stakeholders.
- Demonstrated ability to lead and mentor technical teams.

What would be nice to have:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certifications: Snowflake certifications or other relevant industry certifications.
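The modeling concepts this role asks for (star vs. snowflake schema) can be sketched in a minimal, self-contained example. This is an illustrative sketch only: the table and column names are hypothetical, and Python's built-in SQLite stands in for a warehouse platform such as Snowflake.

```python
import sqlite3

# In-memory database stands in for a warehouse; all names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: one central fact table referencing flat, denormalized dimensions.
cur.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT,
    region      TEXT    -- denormalized: region kept inline (star schema)
);
CREATE TABLE dim_date (
    date_id  INTEGER PRIMARY KEY,
    iso_date TEXT
);
CREATE TABLE fact_payment (
    payment_id  INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
""")

# A snowflake schema would instead normalize dim_customer further,
# e.g. moving region into its own dim_region table keyed by region_id.

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'EMEA')")
cur.execute("INSERT INTO dim_date VALUES (10, '2025-06-17')")
cur.execute("INSERT INTO fact_payment VALUES (100, 1, 10, 250.0)")

# Typical star-schema query: join the fact table to a dimension and aggregate.
cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_payment f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.region
""")
print(cur.fetchall())  # [('EMEA', 250.0)]
```

The trade-off the two terms name: a star schema denormalizes dimensions for simpler, faster joins; a snowflake schema normalizes them further, saving storage and avoiding update anomalies at the cost of extra joins.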
- Industry Experience: Experience in the Finance/Cards/Payments industry.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
0.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job ID: R-229455
Date posted: 06/17/2025
Job Title: Consultant - Platform Engineer
Career Level: C3

Introduction to role
AstraZeneca is seeking an IT Integration Engineer to join our R&D IT Team. This role involves managing and maintaining our GxP-compliant product integrations, which are part of our Clinical Development Platforms and used across all therapeutic areas. As a member of our OCD Integration team, you will collaborate with Product Leads, DevOps Leads, and technical engineers to drive innovation and efficiency.

Accountabilities / Key Responsibilities:
- Build integration pipelines in alignment with standard architectural patterns.
- Create reusable artefacts wherever possible.
- Build user guides and best practices for tool adoption and usage.
- Utilize vendor-based products to build optimal solutions.
- Provide full-lifecycle tooling guidance and reusable artefacts to product teams.
- Collaborate with the Integration Lead and vendor teams to build and manage integrations.
- Participate in continuous improvement discussions with business and IT stakeholders as part of the scrum team.
- Solve day-to-day BAU tickets.

Essential Skills/Experience
- At least 4-6 years of hands-on experience with SnapLogic and its associated Snap Packs.
- At least 2+ years of experience with API terminology and API integration, preferably MuleSoft.
- Excellent SQL knowledge across databases including RDS, Redshift, PostgreSQL, DB2, Microsoft SQL Server, and DynamoDB.
- Good understanding of ETL pipeline design.
- Adherence to the IT Service Delivery Framework for AD activities.
- Ability to organize and prioritize work, meet deadlines, and work independently.
- Strong problem-solving skills.
- Experience with process tools (JIRA, Confluence).
- Ability to work independently and collaborate with people across the globe with diverse cultures and backgrounds.
- Experience working in agile teams using methodologies such as Scrum, Kanban, and SAFe.
- Experience integrating CI/CD processes into the existing Change & Configuration Management scope (i.e., ServiceNow and Jira).

Desirable Skills/Experience
- ITIL practices (change management, incident and problem management, and others).
- Experience in GxP or SOx regulated environments.
- Proficiency in developing, deploying, and debugging cloud-based applications using AWS.
- Exposure to AWS cloud engineering and CI/CD tools (such as Ansible, GitHub Actions, Jenkins).
- Exposure to Infrastructure as Code (CloudFormation, Terraform).
- Good understanding of AWS networking and security configuration.
- Passion for learning, innovating, and delivering valuable software to people.
- Experience with dynamic dashboards (e.g., Power BI).
- Experience in Python programming.
- Experience with SnapLogic and MuleSoft integration platform administration (nice to have).

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, we leverage technology to impact patients and ultimately save lives. We are a purpose-led global organization that pushes the boundaries of science to discover and develop life-changing medicines. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak by combining cutting-edge science with leading digital technology platforms and data. Join us at a crucial stage of our journey in becoming a digital and data-led enterprise. Ready to make a difference? Apply now!

AstraZeneca embraces diversity and equality of opportunity.
We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Consultant - Platform Engineer
Posted date: Jun. 17, 2025
Contract type: Full time
Job ID: R-229455

Why choose AstraZeneca India? Help push the boundaries of science to deliver life-changing medicines to patients. After 45 years in India, we're continuing to secure a future where everyone can access affordable, sustainable, innovative healthcare. The part you play in our business will be challenging, yet rewarding, requiring you to use your resilient, collaborative and diplomatic skillsets to make connections. The majority of your work will be field based, and will require you to be highly organised, planning your monthly schedule, attending meetings and calls, as well as writing up reports.

Who do we look for? Calling all tech innovators, ownership takers, challenge seekers and proactive collaborators. At AstraZeneca, breakthroughs born in the lab become transformative medicine for the world's most complex diseases. We empower people like you to push the boundaries of science, challenge convention, and unleash your entrepreneurial spirit. You'll embrace differences and take bold actions to drive the change needed to meet global healthcare and sustainability challenges. Here, diverse minds and bold disruptors can meaningfully impact the future of healthcare using cutting-edge technology.
Whether you join us in Bengaluru or Chennai, you can make a tangible impact within a global biopharmaceutical company that invests in your future. Join a talented global team that's powering AstraZeneca to better serve patients every day.

Success Profile: Ready to make an impact in your career? If you're passionate, growth-orientated and a true team player, we'll help you succeed. Here are some of the skills and capabilities we look for.

Tech innovators: Make a greater impact through our digitally enabled enterprise. Use your skills in data and technology to transform and optimise our operations, helping us deliver meaningful work that changes lives.

Ownership takers: If you're a self-aware self-starter who craves autonomy, AstraZeneca provides the perfect environment to take ownership and grow. Here, you'll feel empowered to lead and reach excellence at every level, with unrivalled support when you need it.

Challenge seekers: Adapting and advancing our progress means constantly challenging the status quo. In this dynamic environment where everything we do has urgency and focus, you'll have the ability to show up, speak up and confidently take smart risks.

Proactive collaborators: Your unique perspectives make our ambitions and capabilities possible. Our culture of sharing ideas, learning and improving together helps us consistently set the bar higher. As a proactive collaborator, you'll seek out ways to bring people together to achieve their best.

Responsibilities
Job ID: R-229455
Date posted: 06/17/2025
Job Title: Consultant - Platform Engineer
Career Level: C3

Introduction to role: AstraZeneca is seeking an IT Integration Engineer to join our R&D IT Team. This role involves managing and maintaining our GxP-compliant product integrations, which are part of our Clinical Development Platforms and used across all therapeutic areas.
As a member of our OCD Integration team, you will collaborate with Product Leads, DevOps Leads, and technical engineers to drive innovation and efficiency.

Accountabilities / Key Responsibilities:
- Build integration pipelines in alignment with standard architectural patterns, creating reusable artefacts wherever possible.
- Build user guides and best practices for tool adoption and usage.
- Utilize vendor-based products to build optimal solutions.
- Provide full-lifecycle tooling guidance and reusable artefacts to product teams.
- Collaborate with the Integration Lead and vendor teams to build and manage integrations.
- Participate in continuous improvement discussions with business and IT stakeholders as part of the scrum team.
- Resolve day-to-day BAU tickets.

Essential Skills/Experience:
- 4-6 years of hands-on experience with Snaplogic and its associated Snap Packs.
- 2+ years of experience with API terminology and API integration, preferably Mulesoft.
- Excellent SQL knowledge across relational and NoSQL databases, including RDS, Redshift, Postgres, DB2, Microsoft SQL Server, and DynamoDB.
- Good understanding of ETL pipeline design.
- Adherence to the IT Service Delivery Framework for AD activities.
- Ability to organize and prioritize work, meet deadlines, and work independently.
- Strong problem-solving skills.
- Experience with process tools (JIRA, Confluence).
- Ability to work independently and collaborate with people across the globe with diverse cultures and backgrounds.
- Experience working in agile teams using methodologies such as Scrum, Kanban, and SAFe.
- Experience integrating CI/CD processes into existing change and configuration management scope (e.g., ServiceNow and Jira).

Desirable Skills/Experience:
- ITIL practices (change management, incident and problem management, and others).
- Experience in GxP or SOx regulated environments.
- Proficiency in developing, deploying, and debugging cloud-based applications using AWS.
- Exposure to AWS cloud engineering and CI/CD tools (such as Ansible, GitHub Actions, Jenkins).
- Exposure to Infrastructure as Code (CloudFormation, Terraform).
- Good understanding of AWS networking and security configuration.
- Passion for learning, innovating, and delivering valuable software to people.
- Experience with dynamic dashboards (e.g., Power BI).
- Experience in Python programming.
- Snaplogic and Mulesoft platform administration (nice to have).

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, we leverage technology to impact patients and ultimately save lives. We are a purpose-led global organization that pushes the boundaries of science to discover and develop life-changing medicines. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak by combining cutting-edge science with leading digital technology platforms and data. Join us at a crucial stage of our journey in becoming a digital and data-led enterprise. Ready to make a difference? Apply now!

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics.
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

What we offer: We're driven by our shared values of serving people, society and the planet. Our people make this possible, which is why we prioritise diversity, safety, empowerment and collaboration. Discover what a career at AstraZeneca could mean for you.

Lifelong learning: Our development opportunities are second to none. You'll have the chance to grow your abilities, skills and knowledge constantly as you accelerate your career. From leadership projects and constructive coaching to overseas talent exchanges and global collaboration programmes, you'll never stand still.

Autonomy and reward: Experience the power of shaping your career how you want to. We are a high-performing learning organisation with autonomy over how we learn. Make big decisions, learn from your mistakes and continue growing, with performance-based rewards as part of the package.

Health and wellbeing: An energised work environment is only possible when our people have a healthy work-life balance and are supported for their individual needs. That's why we have a dedicated team to ensure your physical, financial and psychological wellbeing is a top priority.

Inclusion and diversity: Diversity and inclusion are embedded in everything we do. We're at our best and most creative when drawing on our different views, experiences and strengths. That's why we're committed to creating a workplace where everyone can thrive in a culture of respect, collaboration and innovation.
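The integration work described in the posting above centres on pulling data from paginated APIs into relational stores. As an illustrative sketch only (the `fetch_page` callable is a hypothetical stand-in for a real Snaplogic or Mulesoft REST endpoint, and the record counts are invented), the core pagination loop looks like this:

```python
# Hypothetical sketch of a paginated REST extraction step. fetch_page is
# a stand-in for a real HTTP call made through an integration platform;
# nothing here reflects Snaplogic's or Mulesoft's actual APIs.

def extract_all(fetch_page, page_size=100):
    """Pull every record from a paginated source until an empty page."""
    records, page = [], 0
    while True:
        batch = fetch_page(page, page_size)
        if not batch:  # an empty page signals the end of the data set
            break
        records.extend(batch)
        page += 1
    return records

# Stubbed source with 250 records across 3 pages, for illustration only.
def fake_fetch(page, size):
    data = list(range(250))
    return data[page * size:(page + 1) * size]

rows = extract_all(fake_fetch)  # collects all 250 records
```

In a real pipeline the stop condition is usually dictated by the API (a `next` link, cursor token, or total-count header rather than an empty page), so this loop is a simplification.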
Posted 2 days ago
2.0 years
0 Lacs
Port Blair, Andaman and Nicobar Islands, India
On-site
Job Title: Database Developer
Location: Calicut, Kerala (On-site)
Experience: Minimum 2 years
Job Type: Full-time
Notice period: Immediate/15 days
Candidates from Kerala are highly preferred.

Job Summary: We are hiring a skilled and detail-oriented Database Developer with at least 2 years of experience to join our team in Calicut. The ideal candidate will have hands-on expertise in SQL and PostgreSQL, with a strong understanding of database design, development, and performance optimization. Experience with Azure cloud services is a plus.

Key Responsibilities:
- Design, develop, and maintain database structures, stored procedures, functions, and triggers
- Write optimized SQL queries for integration with applications and reporting tools
- Ensure data integrity, consistency, and security across platforms
- Monitor and tune database performance for high availability and scalability
- Collaborate with developers and DevOps teams to support application development
- Maintain and update technical documentation related to database structures and processes
- Assist in data migration and backup strategies
- Work with cloud-based databases and services (preferably on Azure)

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Minimum 2 years of experience as a Database Developer or in a similar role
- Strong expertise in SQL and PostgreSQL database development
- Solid understanding of relational database design and normalization
- Experience in writing complex queries, stored procedures, and performance tuning
- Familiarity with version control systems like Git
- Strong analytical and troubleshooting skills

Preferred Qualifications:
- Experience with Azure SQL Database, Data Factory, or related services
- Knowledge of data warehousing and ETL processes
- Exposure to NoSQL or other modern database technologies
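The "write optimized SQL queries" responsibility above boils down to parameterized queries plus indexes on filter columns. A minimal sketch, using the standard-library SQLite module in place of PostgreSQL (the `orders` table and its data are invented for illustration):

```python
import sqlite3

# Illustrative only: SQLite stands in for PostgreSQL here, and the
# table/column names are hypothetical. The pattern shown is the same in
# both engines: parameterized queries plus an index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 300.0)],
)
# An index on the filter column lets the planner avoid a full table scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
# Parameterized placeholders (?) keep the query plan reusable and safe
# from SQL injection.
row = conn.execute(
    "SELECT customer, SUM(total) FROM orders WHERE customer = ? GROUP BY customer",
    ("acme",),
).fetchone()
```

In PostgreSQL the same idea applies via `CREATE INDEX` and `%s` placeholders (e.g., with psycopg2), and `EXPLAIN` confirms whether the index is actually used.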
Posted 2 days ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
India's major tech hubs are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
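The extract-transform-load cycle those skills support can be shown end to end in a few lines. A minimal sketch using only the Python standard library (the CSV content, column names, and `totals` table are invented for illustration; real pipelines would use an ETL tool or a warehouse rather than in-memory SQLite):

```python
import csv
import io
import sqlite3

# Invented sample data standing in for a real source file.
raw = "name,amount\nalice,10\nbob,20\nalice,5\n"

# Extract: parse rows from the source.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast the amount to an integer and aggregate per name.
totals = {}
for r in rows:
    totals[r["name"]] = totals.get(r["name"], 0) + int(r["amount"])

# Load: write the aggregated result into a target table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE totals (name TEXT PRIMARY KEY, amount INTEGER)")
db.executemany("INSERT INTO totals VALUES (?, ?)", totals.items())
loaded = dict(db.execute("SELECT name, amount FROM totals"))
```

Each stage maps directly to what tools like Informatica or Talend do at scale: source connectors for extract, mapping/aggregation components for transform, and target connectors for load.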
ETL job interviews commonly cover topics such as SQL joins and aggregations, data warehousing concepts, ETL tool usage, data quality checks, and performance tuning, so it is worth preparing examples from your own experience in each area.
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!