0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 1+ years of data engineering experience - Experience with SQL - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) The Prime Data Engineering & Analytics (PDEA) team is seeking to hire passionate Data Engineers to build and manage the central petabyte-scale data infrastructure supporting worldwide Prime business operations. At Amazon Prime, understanding customer data is paramount to our success in providing customers with relevant and enticing benefits such as fast free shipping, instant videos, streaming music and free Kindle books in the US and international markets. At Amazon you will be working in one of the world's largest and most complex data environments. You will be part of a team that works with the marketing, retail, finance, analytics, machine learning and technology teams to provide real-time data processing solutions that give Amazon leadership, marketers and PMs timely, flexible and structured access to customer insights. The team will be responsible for building this platform end to end using the latest AWS technologies and software development principles. As a Data Engineer, you will be responsible for leading the architecture, design and development of the data, metrics and reporting platform for Prime. You will architect and implement new and automated Business Intelligence solutions, including big data and new analytical capabilities that support our Development Engineers, Analysts and Retail business stakeholders with timely, actionable data, metrics and reports while satisfying scalability, reliability, accuracy, performance and budget goals and driving automation and operational efficiencies. You will partner with business leaders to drive strategy and prioritize projects and feature sets. You will also write and review business cases and drive the development process from design to release. In addition, you will provide technical leadership and mentoring for a team of highly capable Data Engineers. Responsibilities 1. Own design and execution of end-to-end projects 2. Own managing WW Prime core services data infrastructure 3. Establish key relationships which span Amazon business units and Business Intelligence teams 4. Implement standardized, automated operational and quality control processes to deliver accurate and timely data and reporting to meet or exceed SLAs Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Testing Engineer Exp: 8+ years Location: Hyderabad and Gurgaon (Hybrid) Notice Period: Immediate to 15 days Job Description: Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across different layers of the data warehouse. - Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules - Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues. - Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts. - Conduct testing in cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability. - Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt). - Experience with scripting languages to automate data testing. - Familiarity with data visualization tools like Tableau, Power BI, or Looker
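A minimal sketch of the source-to-target validation this posting describes, using Python's built-in sqlite3 as a stand-in for the real source and warehouse connections; the orders table, column names, and checks are illustrative assumptions:

```python
import sqlite3

def validate(conn_src, conn_tgt, src_table, tgt_table, key_col, sum_col):
    """Compare row counts, key cardinality, and a numeric checksum
    between a source table and its warehouse target."""
    checks = {
        "row_count":     "SELECT COUNT(*) FROM {t}",
        "distinct_keys": f"SELECT COUNT(DISTINCT {key_col}) FROM {{t}}",
        "checksum":      f"SELECT ROUND(SUM({sum_col}), 2) FROM {{t}}",
    }
    failures = []
    for name, sql in checks.items():
        src_val = conn_src.execute(sql.format(t=src_table)).fetchone()[0]
        tgt_val = conn_tgt.execute(sql.format(t=tgt_table)).fetchone()[0]
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val} target={tgt_val}")
    return failures

# Illustrative run against throwaway in-memory tables
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for c in (src, tgt):
    c.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0)])  # missing a row
print(validate(src, tgt, "orders", "orders", "order_id", "amount"))
# -> ['row_count: source=2 target=1', ...]
```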
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
IMEA (India, Middle East, Africa) India LIXIL INDIA PVT LTD Employee Assignment Fully remote possible Full Time 1 May 2025 Title Senior Data Engineer Job Description A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository. Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions. Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency. Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects. Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability. Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and adherence to regulatory requirements. Experience & Skills Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.). Strong programming skills in SQL, Java, and Python. Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS or API-based extraction. Expertise in real-time data processing frameworks. Strong understanding of Git and CI/CD for automated deployment and version control. Experience with Infrastructure-as-Code tools like Terraform for cloud resource management. Good stakeholder management skills to collaborate effectively across teams. Solid understanding of SAP ERP data and processes to integrate enterprise data sources. Exposure to data visualization and front-end tools (Tableau, Looker, etc.). Strong command of English with excellent communication skills.
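As a hedged illustration of the Apache Airflow pipeline configuration this posting names, a minimal extract-transform-load DAG; the daily_sales_etl name, schedule, and task bodies are hypothetical placeholders:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source system")      # placeholder task body

def transform():
    print("standardize formats")          # placeholder task body

def load():
    print("write to central repository")  # placeholder task body

with DAG(
    dag_id="daily_sales_etl",             # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                    # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t    # linear dependency chain
```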
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
IMEA (India, Middle East, Africa) India LIXIL INDIA PVT LTD Employee Assignment Fully remote possible Full Time 1 May 2025 Title Data Engineer Job Description A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository. Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions. Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency. Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects. Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability. Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and adherence to regulatory requirements. Experience & Skills Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.). Strong programming skills in SQL, Java, and Python. Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS or API-based extraction. Expertise in real-time data processing frameworks. Strong understanding of Git and CI/CD for automated deployment and version control. Experience with Infrastructure-as-Code tools like Terraform for cloud resource management. Good stakeholder management skills to collaborate effectively across teams. Solid understanding of SAP ERP data and processes to integrate enterprise data sources. Exposure to data visualization and front-end tools (Tableau, Looker, etc.). Strong command of English with excellent communication skills.
Posted 1 week ago
5.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role: Data | Location: Jaipur/Indore Notice: Immediate joiners only Basic Responsibilities (Must-Haves) 5+ years of experience in dashboard story development, dashboard creation, and data engineering pipelines. Hands-on experience with log analytics, user engagement metrics, and product performance metrics. Ability to identify patterns, trends, and anomalies in log data to generate actionable insights for product enhancements and feature optimization. Collaborate with cross-functional teams to gather business requirements and translate them into functional and technical specifications. Manage and organize large volumes of application log data using Google BigQuery. Design and develop interactive dashboards to visualize key metrics and insights using any of the tools like Tableau, Power BI, or ThoughtSpot AI. Create intuitive, impactful visualizations to communicate findings to teams including customer success and leadership. Ensure data integrity, consistency, and accessibility for analytical purposes. Analyse application logs to extract metrics and statistics related to product performance, customer behaviour, and user sentiment. Work closely with product teams to understand log data generated by Python-based applications. Collaborate with stakeholders to define key performance indicators (KPIs) and success metrics. Ability to optimize data pipelines and storage in BigQuery. Strong communication and teamwork skills. Ability to learn quickly and adapt to new technologies. Excellent problem-solving skills. Preferred Responsibilities (Nice-to-Haves) Knowledge of Generative AI (GenAI) and LLM-based solutions. Experience in designing and developing dashboards using ThoughtSpot AI. Good exposure to Google Cloud Platform (GCP). Data engineering experience with modern data warehouse architectures. Additional Responsibilities Participate in the development of proof-of-concepts (POCs) and pilot projects. Ability to articulate ideas and points of view clearly to the team. Take ownership of data analytics and data engineering solutions. Additional Nice-to-Haves Experience working with large datasets and distributed data processing tools such as Apache Spark or Hadoop. Familiarity with Agile development methodologies and version control systems like Git. Familiarity with ETL tools such as Informatica or Azure Data Factory
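For the BigQuery log-management and pipeline-optimization duties above, partitioning and clustering are the usual levers; a sketch with the google-cloud-bigquery client, where the project, dataset, and event schema are assumptions for illustration:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Partitioning + clustering keeps log scans (and cost) bounded to the
# dates and users a dashboard actually queries.
client.query("""
    CREATE TABLE IF NOT EXISTS app_logs.events (
      event_ts   TIMESTAMP,
      user_id    STRING,
      event_name STRING,
      payload    JSON
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id, event_name
""").result()

# Typical engagement metric: daily active users over the last week.
rows = client.query("""
    SELECT DATE(event_ts) AS day, COUNT(DISTINCT user_id) AS dau
    FROM app_logs.events
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY day
    ORDER BY day
""").result()
for r in rows:
    print(r.day, r.dau)
```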
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Role: Data QA Lead Experience Required- 8+ Years Location- India/Remote Company Overview At Codvo.ai, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results. The Data Quality Analyst is responsible for ensuring the quality, accuracy, and consistency of data within the Customer and Loan Master Data API solution. This role will work closely with data owners, data modelers, and developers to identify and resolve data quality issues. Key Responsibilities Lead and manage end-to-end ETL/data validation activities. Design test strategy, plans, and scenarios for source-to-target validation. Build automated data validation frameworks (SQL/Python/Great Expectations). Integrate tests with CI/CD pipelines (Jenkins, Azure DevOps). Perform data integrity, transformation logic, and reconciliation checks. Collaborate with Data Engineering, Product, and DevOps teams. Drive test metrics reporting, defect triage, and root cause analysis. Mentor QA team members and ensure process adherence. Must-Have Skills 8+ years in QA with 4+ years in ETL testing. Strong SQL and database testing experience. Proficiency with ETL tools (Airbyte, DBT, Informatica, etc.). Automation using Python or similar scripting language. Solid understanding of data warehousing, SCD, deduplication. Experience with large datasets and structured/unstructured formats. Preferred Skills Knowledge of data orchestration tools (Prefect, Airflow). Familiarity with data quality/observability tools. Experience with big data systems (Spark, Hive). Hands-on with test data generation (Faker, Mockaroo).
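One possible shape for the "automated data validation frameworks (SQL/Python/Great Expectations)" responsibility, sketched against the classic pandas API of Great Expectations (pre-1.0 releases; current versions use a context/suite-based API instead); the loan columns are illustrative:

```python
import great_expectations as ge
import pandas as pd

# Wrap a pandas frame so expectation methods become available (classic API).
df = ge.from_pandas(pd.DataFrame({
    "loan_id": [101, 102, 103],
    "balance": [1000.0, 2500.5, 320.0],
}))

# Declarative checks: nullability, uniqueness, and a sanity range.
df.expect_column_values_to_not_be_null("loan_id")
df.expect_column_values_to_be_unique("loan_id")
df.expect_column_values_to_be_between("balance", min_value=0)

report = df.validate()  # aggregate pass/fail result across all expectations
print(report)
```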
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Chennai, Bengaluru
Hybrid
Job Description: Role: Oracle, Informatica, PL/SQL, ETL Location: Chennai/Bangalore Experience: 5+ Years Must have: Oracle, Informatica, PL/SQL, ETL Looking for a candidate with expertise in Oracle Database, SnapLogic and Oracle PL/SQL, with knowledge of AWS cloud. The Value You Deliver: As a Software engineer, you build quality solutions that align with the technology blueprint and best practices to solve business problems by driving design, development, and ongoing support. You develop Oracle PL/SQL Stored Procedures for business functionality. You develop SnapLogic pipelines for integration and processing across the application's multiple data stores. You actively participate in release planning and daily stand-ups, as well as working with and helping the team on tactical activities like code review, performance tuning, bug fixes, design optimization etc. You understand key design aspects of Performance, Scalability, Resiliency, Security, and High Availability, etc. and follow recommended Engineering Practices. The Skills that are Key to this role: You have strong Design/Development skills in SQL and Oracle PL/SQL that include performance tuning, writing packages, stored procedures, troubleshooting skills etc. You have strong SnapLogic Design/Development skills. You have proven design thinking and solutioning ability to provide optimal solutions for complex data transformation use cases. You have deep knowledge of Oracle database concepts and implementation experience. You have solid experience in writing complex SQL queries on Oracle RDS and performance optimization for large data volumes, with experience performing deep data analysis on multiple database platforms. You have hands-on experience building and deploying applications using a variety of Amazon Web Services (primarily database services). You are an excellent communicator with both technical and non-technical players to ensure common understanding of design to streamline and optimize data enablement. You have prior experience working in Agile software development environments, with proven ability to convert user stories into delivery work that provides incremental, iterative delivery of business value. You need to bring passion to your work and operate with a sense of purpose that inspires and motivates those you work with. You are expected to be intellectually curious, take initiative, and love learning new skills and capabilities. You should be able to switch focus based on changing priorities within a fast-paced team. The Expertise We are Looking For: Minimum 5 years of relevant experience bringing a data-driven approach to strategic business decision-making. Experience with Oracle Database, SQL, SnapLogic, Oracle PL/SQL, Informatica, Control-M, Scripting (PowerShell, Python etc.). Certification in AWS or SnapLogic is a great value add. Ability to generate meaningful insights through data analytics and research and to present complex data, financial analyses, statistics, and research findings in a simple, clear and actionable way. Excellent communication skills and ability to interact with all levels of end users and technical resources. Bachelor's or Master's degree in computer science or Information Technology. The Skills that are Good to Have for this role: Having knowledge of Aerospike, Power BI and visualization tools like Qlik/Alteryx is preferred. Exposure to Finance and Market is preferred. Ability to perform independent technical analysis on complex projects.
Location for This Role : Chennai/ Bangalore Shift timings : 11:00 am - 8:00pm (these are the official working hours so as to enable overlap with the US; that said, associates can exercise flexi logins/logouts from office and work remote for calls/meetings with the US)
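To make the PL/SQL-plus-AWS stack above concrete, a hedged sketch of invoking an Oracle stored procedure from Python with the python-oracledb driver; the DSN, credentials, and the refresh_daily_positions procedure are placeholders, not details from the posting:

```python
import oracledb

# Connection details are illustrative; an Oracle RDS endpoint would look similar.
conn = oracledb.connect(user="app_user", password="***",
                        dsn="dbhost:1521/ORCLPDB1")
with conn.cursor() as cur:
    # Equivalent to the PL/SQL block: BEGIN refresh_daily_positions(:run_date); END;
    cur.callproc("refresh_daily_positions", ["2025-05-01"])
conn.commit()
conn.close()
```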
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Please note, this is a 12-month initial contract, with the possibility of extensions. This role is hybrid in 560037, Bengaluru. Insight Global are looking for a Data Management Business Analyst to join one of their premium clients in the financial services space. You will play a pivotal role in bridging the gap between business needs and technical solutions, with a strong emphasis on data governance and data management. You will ensure that the company's data assets are effectively governed, secure, and aligned with business objectives, with a specific focus on supporting the capture of data lineage across the technology estate. You will be the liaison for internal stakeholders when it comes to understanding requirements. You may also be involved in manipulating data at the same time. Must haves: 5+ years' experience in a Business Analyst and/or Data Analyst role with a focus on data governance, data management, or data quality Strong technical understanding of data systems, including databases (for example, SQL), data modelling, and data integration tools Proficiency in data analysis tools and techniques (such as Python, R, or Excel) Experience in developing and implementing data governance frameworks, policies, or standards Excellent communication and stakeholder management skills, with the ability to translate complex technical concepts into simplified business language Experience creating business requirement documentation (BRD) Strong understanding of regulatory compliance requirements related to data (for example GDPR, DORA, or industry-specific regulations) Bachelor's degree in a relevant field such as Computer Science, Information Systems, Data Science, Business Administration, or equivalent Plusses: Hands-on experience with data governance tools (such as Collibra, Informatica or Solidatus) Familiarity with cloud-based data platforms (such as Azure, AWS or Google Cloud) Knowledge of modern data platforms (for example Snowflake, Databricks or Azure Data Lake) Knowledge of data visualization tools for presenting insights (for example Tableau or Power BI) Experience writing user stories Experience working in an Agile environment (using tools such as Jira is advantageous) Experience working in financial services or other regulated industries Understanding of machine learning or advanced analytics concepts An advanced degree in Data Science, Business Analytics, or related fields Professional certifications in business analysis (such as CBAP, CCBA), data analysis, or data governance (such as DAMA CDMP, CISA) are highly desirable
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Oracle Procedural Language Extensions to SQL (PLSQL) Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the functionality and efficiency of the applications. This role requires a strong understanding of Oracle Procedural Language Extensions to SQL (PLSQL) and the ability to work collaboratively with the team to provide solutions to work-related problems. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze requirements. - Design, develop, and test PLSQL code to meet business needs. - Troubleshoot and debug application issues to ensure optimal performance. - Optimize database queries and improve application performance. - Document technical specifications and user guides for developed applications. Professional & Technical Skills: - Must To Have Skills: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL) and Informatica - Strong understanding of database concepts and SQL. - Experience in performance tuning and query optimization. - Knowledge of software development life cycle (SDLC) methodologies. - Familiarity with version control systems such as Git or SVN. Additional Information: - The candidate should have a minimum of 4 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL). - This position is based in Gurugram. - A 15 years full-time education is required.
Posted 1 week ago
7.5 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Informatica PowerCenter Good to have skills : Oracle Applications Development, Control-M Administration Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Support Engineer, you will support multiple applications in Acceptance to meet business process and application requirements. Roles & Responsibilities: - Support batch runs and provide solutions in case of failures in Acceptance - Deployment of the code in acceptance environments. - Collaborate with different application teams and resolve batch/deployment issues if any - Provide services during acceptance test batch runs, as agreed between the Engagement Managers at a later point in time. - Acceptance support to handle issues if any post deployment. - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Implement and maintain data pipelines. - Provide system documentation. Professional & Technical Skills: - Must Have Skills: Proficiency in Control-M, Informatica PowerCenter, SQL - Strong understanding of database concepts. - Experience with SQL, relational databases, HDFS, Yarn - Must have experience working on Control-M - Good to have Cloudera Manager experience - Knowledge of data warehousing concepts. Additional Information: - The candidate should have a minimum of 5 years of experience in Control-M, Informatica PowerCenter - A 15 years full-time education is required
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Ab Initio Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and addressing any issues that may arise. Your typical day will involve collaborating with the team to understand requirements, designing and developing applications, and testing and debugging code to ensure optimal performance and functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Collaborate with the team to understand application requirements. - Design and develop applications based on business process requirements. - Test and debug code to ensure optimal performance and functionality. - Address any issues or bugs that arise in the applications. - Provide technical support and guidance to end-users. - Stay updated with the latest industry trends and technologies. - Assist in the deployment and maintenance of applications. Professional & Technical Skills: - Must To Have Skills: Proficiency in Ab Initio. - The candidate should have a minimum of 4 years of experience in Ab Initio. - Strong understanding of data integration and ETL concepts. - Experience in designing and developing ETL workflows using Ab Initio. - Knowledge of database concepts and SQL. - Familiarity with data warehousing and data modeling. - Good To Have Skills: Experience with other ETL tools such as Informatica or DataStage. Additional Information: - The candidate should have a minimum of 4 years of experience in Ab Initio. - This position is based at our Pune office. - A 15 years full-time education is required.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Informatica Data Quality Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Assist in the design and implementation of data architecture to support data initiatives. - Monitor and optimize data pipelines for performance and reliability. Professional & Technical Skills: - Must To Have Skills: Proficiency in Informatica Data Quality. - Strong understanding of data integration techniques and ETL processes. - Experience with data profiling and data cleansing methodologies. - Familiarity with database management systems and SQL. - Knowledge of data governance and data quality best practices. Additional Information: - The candidate should have minimum 3 years of experience in Informatica Data Quality. - This position is based at our Hyderabad office. - A 15 years full time education is required.
Posted 1 week ago
10.0 years
0 Lacs
India
Remote
Role: Senior Azure / Data Engineer (ETL/Data warehouse background) Location: Remote, India Duration: Long Term Contract Need with 10+ years of experience Must have Skills: - Min 5 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms. - 10+ years of proven experience with SQL, schema design, and dimensional data modeling - Solid knowledge of data warehouse best practices, development standards, and methodologies - Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc. - Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL. - Be an independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment. - Excellent communication and teamwork abilities. Nice-to-Have Skills: - Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge. - SAP ECC/S/4 and HANA knowledge. - Intermediate knowledge of Power BI - Azure DevOps and CI/CD deployments, cloud migration methodologies and processes
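As a sketch of the PySpark/Spark SQL skills called out above: read raw files, run a Spark SQL aggregation, write a curated table. The ADLS path and table names are illustrative assumptions:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Hypothetical ADLS Gen2 landing zone for raw order events.
orders = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")
orders.createOrReplaceTempView("orders")

# Spark SQL transform: daily spend per customer.
curated = spark.sql("""
    SELECT customer_id,
           DATE(order_ts) AS order_date,
           SUM(amount)    AS daily_spend
    FROM orders
    GROUP BY customer_id, DATE(order_ts)
""")

# Persist as a managed table for downstream BI consumption.
curated.write.mode("overwrite").saveAsTable("curated.daily_spend")
```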
Posted 1 week ago
2.0 - 4.0 years
8 - 11 Lacs
Chennai
Hybrid
This is an operational role responsible for providing data analysis & management support. The incumbent may seek an appropriate level of guidance and advice to ensure delivery of quality outcomes. Responsibilities Gathering and preparing relevant data to use in analytics applications. Acquiring data from primary or secondary data sources and supporting the maintenance of databases Identify, analyze, and interpret trends or patterns in data sets. Filter and clean data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Develop and support ETL jobs, schedule batch jobs via cron, database modeling for RDBMS Gather data requirements, follow Scrum methodology, ownership from development to deployment Minimum qualification & experience 6 months to 2 years of DB programming and any ETL tool (Pentaho preferred)/data engineering Desired Skill sets Database programming RDBMS like Oracle/MySQL/MariaDB ETL tool (Informatica/Pentaho preferred) Production support experience is desirable DB Design Python skills are value added UNIX commands and job monitoring and debugging skills
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Informatica Intelligent Cloud Services Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in Informatica Intelligent Cloud Services. - Strong understanding of application development methodologies. - Experience with cloud-based application deployment and management. - Familiarity with data integration and transformation processes. - Ability to troubleshoot and resolve application issues efficiently. Additional Information: - The candidate should have minimum 3 years of experience in Informatica Intelligent Cloud Services. - This position is based at our Pune office. - A 15 years full time education is required.
Posted 1 week ago
8.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description About Sopra Steria Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion. Job Description The world is how we shape it. Position: Snowflake - Senior Technical Lead Experience: 8-11 years Location: Noida/ Bangalore Education: B.E./ B.Tech./ MCA Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security Good to have Skills: Snowpark, Data Build Tool, Finance Domain Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing. Experience in data warehousing, with at least 2 years focused on Snowflake. Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration. Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks. Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning. Familiarity with data security, compliance requirements, and governance best practices. Experience in Python, Scala, or Java for Snowpark development is good to have. Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM) Key Responsibilities Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost. Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe). Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion). Monitor query performance and resource utilization; tune warehouses, caching, and clustering. Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads. Define and enforce role-based access control (RBAC), masking policies, and object tagging. Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured. Establish best practices for dimensional modeling, data vault architecture, and data quality. Create and maintain data dictionaries, lineage documentation, and governance standards. Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets. Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies. Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives. Qualifications BTech/MCA Additional Information At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
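A hedged sketch of the Streams & Tasks pattern this role leads, issued through the snowflake-connector-python driver; the account, warehouse, and object names are placeholders:

```python
import snowflake.connector

# Connection parameters are illustrative placeholders.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="***",
    warehouse="ETL_WH", database="RAW", schema="SALES",
)
cur = conn.cursor()

# A stream records row-level changes (CDC offsets) on the landing table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A task periodically drains the stream into the curated table, but only
# when the stream actually has data to process.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated_orders (order_id, amount)
      SELECT order_id, amount
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended
```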
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Title: Head of R&D AI Engineering About The Job At Sanofi, we're committed to providing the next-gen healthcare that patients and customers need. It's about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as Head of R&D AI Engineering and you can help make it happen. What You Will Be Doing Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives. The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities. As Head of R&D AI Engineering, you will be a leader on a dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work with R&D-based Product and Product Line Owners and manage engineering teams within an agile environment to drive the design, development, and deployment of robust, scalable, and performance-based AI/ML endpoints and workflows. Our vision for digital, data analytics and AI Join us on our journey in enabling Sanofi's Digital Transformation through becoming an AI-first organization. This means: AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment. Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack. World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets. We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people's lives. We're also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?
Main Responsibilities AI Product Engineering (workflows, API endpoints, databases): Develop and execute a vision & strategy for AI/ML product development Lead AI product development within R&D in alignment with R&D data strategy, business goals, and objectives, making AI models accessible and usable by end users through the deployment of AI/ML models and exposure through APIs/endpoints Use best practices to implement bias protection and foster fairness and transparency within AI/ML technologies; consult with Responsible AI stakeholders to ensure compliant, ethical, and responsible use Collaborate closely with the AI/ML model development team to coordinate on development handover activities Manage project timelines, resource allocation, and priorities to ensure timely delivery of AI/ML product engineering activities Innovation & Team Leadership Build and lead a high-performing team of engineers, providing oversight, mentorship, training, resources, and best practices to empower the team to deliver engineering with excellence Design, implement, and continuously improve a model for AI/ML product delivery within the Data and AI Product Delivery Team Provide specific guidance and escalation support for any challenges faced by AI Engineers during the delivery of AI products Communicate and advise on AI/ML strategies, potential, limitations, progress, and results to stakeholders and leadership to ensure alignment across the R&D business Cross-Team Collaboration Collaborate and coordinate with Platform and Data Engineering Teams to maintain data pipelines and consistent ways of working across teams Partner with Enterprise Digital and oversee AI/ML data protection, governance, and compliance About You Key Functional Requirements & Qualifications: Master's in computer science, engineering, AI/ML, or a related field; 8-10 years of experience in software engineering, data science, AI, or analytics and 6+ years of experience in AI/ML engineering or a related field Proven track record leading successful AI/ML data product projects across design, development, scaling, and maintenance Understanding of R&D business and data environment strongly preferred Strong communication between technical and non-technical stakeholders and collaborators Demonstrated team leadership, mentorship, and management skills Key Technical Requirements & Qualifications Strong expertise in machine learning, deep learning, natural language processing, and other related ML/AI technologies Expert experience with AI/ML frameworks and tools and in programming languages such as Python, R, Java Expert experience with the design and development of APIs/Endpoints (e.g., Flask, Django, FastAPI) Expertise in cloud platforms and software involved in the deployment and scaling of AI/ML models Expertise with data analytics and statistical software (incl. SQL, Python, Java, Excel, AWS, Snowflake, Informatica) Why Choose Us? Bring the miracles of science to life alongside a supportive, future-focused team Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs Sanofi careers - it all starts with you! Pursue Progress Discover Extraordinary Better is out there.
Better medications, better outcomes, better science. But progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. So, let's be those people. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com! Sanofi is an equal opportunity employer committed to diversity and inclusion. Our goal is to attract, develop and retain highly talented employees from diverse backgrounds, allowing us to benefit from a wide variety of experiences and perspectives. We welcome and encourage applications from all qualified applicants. Accommodations for persons with disabilities required during the recruitment process are available upon request.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Applause is raising the bar for digital quality and employee experience. Recognized as a Top Workplace, Applause provides award-winning software testing and UX research solutions to top brands. Our fully managed services leverage a global team and the world's largest independent testing community. We improve digital experiences for global innovators like Google, Microsoft, PayPal, Starbucks, Vodafone, and BMW. As a Business Intelligence Analyst you will be part of our global Data and Analytics team. This position will play a key role in maintaining and enhancing our enterprise business intelligence environment. This individual will form relationships with subject matter experts across the company and business leaders to help enhance business decisions and reporting capabilities by having a strong data background. The right candidate will exhibit outstanding data understanding, a drive to learn new systems and business data, and the ability to thrive in a fast-paced, and sometimes ambiguous, work environment. Key Responsibilities: Available to work until 10:30 PM IST to ensure effective collaboration with global teams. Collaborating with business users and stakeholders to understand their data analysis and reporting requirements. Identifying the key metrics, dimensions, and data sources needed for the Qlik applications. Designing and implementing the data model within the Qlik environment. This includes extracting, transforming, and loading (ETL) data from various sources, creating data connections, and defining relationships between data tables. Developing interactive dashboards, visualizations, and reports using Qlik's data visualization tools. Designing and implementing user-friendly interfaces that allow users to explore data, apply filters, and drill down into details. Writing and maintaining Qlik scripting to load and transform data from different sources. This involves data cleansing, aggregation, joining tables, and implementing complex calculations or business logic. Writing, modifying, testing, and verifying SQL queries based on business requirements. Optimizing Qlik applications for performance and efficiency. Identifying and resolving issues related to data model design, data loading, scripting, or visualizations to ensure optimal application responsiveness and speed. Conducting thorough testing of Qlik applications to ensure data accuracy, functionality, and performance Documenting the design, development process, and application functionalities for future reference in Jira and internal training documentation Creating user guides and providing training to end-users on how to use the Qlik applications effectively. Designing and building complex BI solutions that have a global perspective, but can be flexible for regional-specific requirements. Working with colleagues across the company to obtain requirements, business logic, and technical details for BI solutions. Determining and scheduling data jobs during optimum business hours. Working closely with and collaborating with team members on initiatives. Maintaining high standards of data quality and integrity. Taking lead on projects, but collaborating with team members. Job Requirements and Preferred Skills: 5+ years working with Qlik Sense, QlikView, or Qlik Cloud; other BI tool experience may be considered.
5+ years of Business Intelligence experience 5+ years of SQL writing experience Experience with Fivetran, Snowflake, Hightouch, Informatica, or other related tools is a plus Strong analytical skills to troubleshoot databases and data issues and identify solutions. A clear sense of urgency and a desire to learn. Ability to manage communications effectively with various cultures and across multiple time zones across the globe. Excellent organizational, analytical, problem-solving and communication skills. Team player with solid communication and presentation skills. Why Applause? We're proud to cultivate an inspiring, engaging employee culture that's consistently reflected in high employee retention rates and satisfaction. Our talented team - known as Applause Nation - is set up for success with the latest collaboration and learning tools, opportunities for career advancement, and more. We have a flexible work environment with top talent from across the globe Collaborate with an international team of 450+ passionate, talented co-workers Expand your portfolio with exciting, hands-on projects providing exposure to well-known, global brands Learn and grow through structured onboarding, in-house knowledge sessions and access to thousands of virtual courses available on demand Incorporate AI and other exciting technologies into your work, to help you prioritize and boost productivity Experience a supportive culture that emphasizes teamwork, innovation and transparency Share your voice! Contribute and integrate creative and innovative ideas across roles and departments Applause Core Values: As a global employee community, we strive to uphold the following core values, which are critical to business success and how we measure individual and team performance. Do you share our core values? Be Accountable: You love to take ownership, and hold yourself and others accountable to increase empowerment and success. Celebrate Authenticity: You love bringing your true self to work and creating genuine and trustful relationships within a diverse environment. In It Together: You have a team-first mindset and love collaborating with your peers. Create Value for Our Customers: You love delivering meaningful business impact and being a release partner for all aspects of digital quality. Crush Your Goals: You always strive for excellence and constantly seek ways to be better, more effective and more efficient.
Posted 1 week ago
10.0 - 15.0 years
30 - 35 Lacs
Chennai, Bengaluru
Work from Office
Principal Architect (Data and Cloud) - Neoware Technology Solutions Private Limited Requirements More than 10 years of experience in Technical, Solutioning, and Analytical roles. 5+ years of experience in building and managing Data Lakes, Data Warehouse, Data Integration, Data Migration and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure). Ability to understand business requirements, translate them into functional and non-functional areas, define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience etc. Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. Experience of having worked in distributed computing and enterprise environments like Hadoop and GCP/AWS/Azure Cloud. Well versed with various Data Integration and ETL technologies on Cloud like Spark, PySpark/Scala, Dataflow, DataProc, EMR, etc. on various Clouds. Experience of having worked with traditional ETL tools like Informatica/DataStage/OWB/Talend, etc. Deep knowledge of one or more Cloud and On-Premise Databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc. Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc. Experience in architecting and designing scalable data warehouse solutions on cloud on BigQuery or Redshift. Experience in having worked on one or more data integration, storage, and data pipeline tool sets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc. Preferred experience of having worked on Machine Learning Frameworks like TensorFlow, PyTorch, etc. Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers and Microservices Architecture and Design. Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud. Good understanding of BI Reporting and Dashboarding and one or more tool sets associated with it like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc. Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure. Experience of having worked in business transformation projects for movement of On-Premise data solutions to Clouds like GCP/AWS/Azure. Be a trusted technical advisor to customers and solutions for complex Cloud & Data related technical challenges. Be a thought leader in architecture design and development of cloud data analytics solutions. Liaison with internal and external stakeholders to design optimized data analytics solutions. Partner with SMEs and Solutions Architects from leading cloud providers to present solutions to customers. Support Sales and GTM teams from a technical perspective in building proposals and SOWs. Lead discovery and design workshops with potential customers across the globe. Design and deliver thought leadership webinars and tech talks alongside customers and partners. Responsibilities Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence. Interface with multiple stakeholders within IT and business to understand the data requirements.
Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction. Responsible for design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems. Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it. Work with the Pre-Sales team on RFPs and RFIs and help them by creating solutions for data. Mentor young talent within the team; define and track their growth parameters. Contribute to building Assets and Accelerators.
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Design, develop, validate, and troubleshoot the ETL workflows and datasets in Alteryx for analysis, forecasting, and report delivery, and update the current SAS DI Studio jobs to migrate the data to the cloud using Alteryx. Skills Required 4 years of experience working with Alteryx. Sound knowledge of Alteryx design, server, and tools like predictive, parsing, and transform. Good foundation in writing SQL queries against any RDBMS. Basic knowledge of tools such as Informatica, Talend, Pentaho, etc. Ability to troubleshoot, analyze, and solve problems. Good understanding of software change and release management.
Posted 1 week ago
4.0 - 6.0 years
8 - 14 Lacs
Chennai
Work from Office
Must have skills : - Bachelor's/Master's in Engineering, Computer Science, or equivalent experience - 5-6 years of experience in the IT industry; experience in the Data space is preferred. - Working experience in GCP-BQ. - Good knowledge of Teradata or Oracle. - Experience in data modelling. - Advanced scripting experience - Python, Shell, etc. - Strong analytical skills including the ability to define problems, collect data, establish facts, and draw valid conclusions - Knowledge of scheduling tools (preferably Airflow, UC4) is a plus - Working knowledge of any ETL tool (e.g., Informatica) is a plus. - Excellent written and oral communication skills - Familiarity with data movement techniques and best practices to handle large volumes of data - Strong communication skills and willingness to take initiative to contribute beyond core responsibilities Responsibilities : In this role, the individual will be part of the Credit data engineering team within the Credit Platform organization and have the following responsibilities : - Design and implement an integrated credit data platform with extremely high-volume, fault-tolerant, scalable backend systems that process and manage petabytes of customer data. - Should adopt a long-term/strategic thought process during the entire project life cycle. - Participate and collaborate with cross-functional teams in the organization to understand the business requirements and deliver solutions that can scale.
Posted 1 week ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Required Skills: Need experienced Informatica MDM Engineers with strong ETL and data integration expertise to design, develop, and maintain MDM and data integration solutions. 5+ years of hands-on experience with Informatica PowerCenter or IICS. Exposure to data quality tools, MDM, or real-time data integration. Strong experience with ETL design and performance tuning. Proficiency in writing complex SQL queries and working with relational databases (Oracle, SQL Server, etc.). Solid understanding of data warehousing concepts, data modeling, and architecture. Familiarity with job scheduling tools and version control systems.
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Gurugram
Work from Office
A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives Job Description: Key responsibilities: Has 2-4 years of experience Will work under close supervision of Tech Leads/Lead Devs Should be able to understand detailed designs with minimal explanation. Individual contributor. The resource will be able to perform mid- to complex-level tasks with minimal supervision. Senior team members will peer review assigned tasks. Build and configure our Marketing Database/Data environment platform by integrating feeds as per detailed design/transformation logic. Good knowledge of Unix scripting and/or Python Must have strong knowledge in SQL Good understanding of ETL (Talend, Informatica, DataStage, Ab Initio, etc.) as well as database skills (Oracle, SQL Server, Teradata, Vertica, Redshift, Snowflake, BigQuery, Azure DW, etc.). Fair understanding of relational databases, stored procs, etc. Experience in Cloud computing (one or more of AWS, Azure, GCP) will be a plus. Less supervision & guidance from senior resources will be required.
Posted 1 week ago
2.0 - 4.0 years
4 - 6 Lacs
Gurugram
Work from Office
Title: Analyst Programmer | Department: AMO (ISS) Production Support | Location: Gurugram | Level: 2
About Fidelity International: Fidelity International offers investment solutions and services and retirement expertise to more than 2.56 million customers globally. As a privately-held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 locations and with $783.6 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals. Our Workplace & Personal Financial Health business provides individuals, advisers and employers with access to world-class investment choices, third-party solutions, administration services and pension guidance. Together with our Investment Solutions & Services business, we invest $567 billion on behalf of our clients. By combining our asset management expertise with our solutions for workplace and personal investing, we work together to build better financial futures.
Our clients come from all walks of life and so do we. We are proud of our inclusive culture and encourage applications from the widest mix of talent, whatever your age, gender, ethnicity, sexual orientation, gender identity, social background and more. We are a disability-friendly company and would welcome a conversation with you if you feel you might benefit from any reasonable adjustments to perform to the best of your ability during the recruitment process and beyond. We are committed to being a truly flexible employer, encouraging and trusting our people to perform their role in the way that works best for them, our business, our colleagues and our clients. We offer the maximum possible flexibility over where and when you work for all, considering your role and any local regulations. We call this new approach dynamic working.
Department Description: AMO (ISS) Production Support covers applications such as Global Fund Data Repository (GFDR), Product Hub, Performance Hub, Product (FRD), Reference Data Service, Transaction Service, Position Service, Frontier, Fund Distribution Service, etc., along with architecture and engineering services spanning Fidelity's business units in the UK and other parts of Europe and Asia; it is a strategic area targeted for growth over the coming years. Various key systems act as enablers for the business in achieving its goals. The enterprise portfolio of projects includes a large collection of strategic initiatives as well as tactical ones to support day-to-day operations and strengthen the environment. The support team maintains the global data warehouse that acts as the single source of data for various lines of business, supporting their MI reporting requirements and data analysis. This data is considered the golden source for distribution data and helps business groups across the organization make knowledge-based decisions.
Purpose of the Role: The position is for an Application Programmer in the AMO Production Support team. The role involves supporting key AMO Enterprise applications and data marts, requiring strong PL/SQL and stored procedure knowledge on the Oracle database platform. The candidate should have strong core skills in Informatica and UNIX shell scripting; in addition, hands-on experience with Control-M technologies would be a plus. The successful candidate will be responsible for supporting the consumption of downstream feeds and applications in varied technologies. This involves intensive interaction with the business and other systems groups, so good communication skills and the ability to work under pressure are an absolute must.
Key Responsibilities: The candidate is expected to display professional ethics in his/her approach to work and exhibit a high level of ownership within a demanding working environment.
- Provide the first line of technical support for business-critical applications (principal technologies/applications used include Oracle, UNIX, PaaS, Python, Java and Control-M).
- Work in the support team alongside data analysts, business analysts, database administrators and business project teams in enhancing and supporting the production services.
- Help maintain Control-M schedules.
- Conduct analysis and fix bugs for production incidents; carry out technical enhancements as desired.
- Carry out daily health-check activities involving application, system, and database checks on production systems/servers (a health-check sketch follows below). The scope of responsibility also covers monitoring business-critical batch workloads, real-time/interactive processing, data transfer services, application on-boarding and upgrades, and recovery procedures.
- Report the root cause of incidents and present ideas on how to prevent them from recurring; ensure adherence to incident and change management processes.
- Engage regularly with business and systems teams looking to adopt and apply the best practices of service support.
- Prepare and maintain application support documentation such as SOM, Service Card, Support Rota, knowledge base, etc.
- Demonstrate continuous effort to improve operations, decrease turnaround times, streamline work processes, and work cooperatively to provide quality, seamless customer service.
- Provide 24x7 support as per support rosters, with flexibility to work in shifts (on-demand and short-term basis) and/or on weekends.
Experience and Qualifications Required:
- Around 2-4 years of technical experience in the software/IT industry in development and support functions
- Minimum 2-4 years of experience in production support roles
Essential technical skills:
- At least 2-4 years of Oracle experience with a strong focus on SQL; PL/SQL knowledge is good to have
- Basic understanding of PaaS technology, Python, core Java and web services/REST APIs
- Core skills in UNIX shell scripting
Essential behavioural/operational skills:
- Ability to apply new skills/additional information acquired in relation to the role
- Ability to interact with end users/business users
- Ability to work closely with cross-functional teams, including infrastructure teams, architects and business analysts
- Ability to prioritise own activities and work under hard deadlines
- Team player with commitment to achieving team goals
- Motivated, flexible and with a can-do approach; keen to learn and develop proficiency
- Good communication skills, both verbal and written; delivery- and results-focused
Good-to-have technical skills:
- Hands-on experience with scheduling tools; Control-M would be a definite plus
- Experience in Informatica is good to have
- Experience with any source control tool; SVN would be a plus
- Good operating systems knowledge and associated commands (UNIX [Linux/AIX], MS Windows)
- Familiarity with Data Warehouse, Datamart and ODS concepts
- Knowledge of essential software engineering principles
- Knowledge of ITIL practices
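For context on the daily health-check duties mentioned above, here is a minimal sketch of the kind of automated check a support engineer might script: one disk-headroom check and one database-listener reachability check. The threshold, host name, and port are hypothetical (1521 is merely Oracle's conventional listener port).

```python
# Daily health-check sketch: verify disk headroom and that a database
# listener is reachable. Threshold, host, and port are hypothetical.
import shutil
import socket


def check_disk(path: str, min_free_pct: float = 15.0) -> bool:
    usage = shutil.disk_usage(path)
    free_pct = usage.free / usage.total * 100
    print(f"{path}: {free_pct:.1f}% free")
    return free_pct >= min_free_pct


def check_listener(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            print(f"{host}:{port} reachable")
            return True
    except OSError as exc:
        print(f"{host}:{port} unreachable: {exc}")
        return False


if __name__ == "__main__":
    ok = check_disk("/") and check_listener("db-host.example.com", 1521)
    raise SystemExit(0 if ok else 1)  # non-zero exit flags a failed check
```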
Posted 1 week ago
0.0 - 5.0 years
11 - 13 Lacs
Bengaluru
Work from Office
- Design, develop, and maintain data integration workflows using Informatica IICS (Cloud Data Integration and Application Integration).
- Develop and optimize ETL solutions using Informatica PowerCenter.
- Work on Snowflake to support data warehousing solutions, including data ingestion, transformation, and performance tuning (see the Snowflake sketch below).
- Write efficient and optimized SQL and PL/SQL queries for data extraction, transformation, and validation.
- Develop and support Unix shell scripts for automation and job scheduling.
- Collaborate with business and technical stakeholders to understand requirements and deliver scalable solutions.
- Participate in design reviews, code reviews, and performance tuning exercises.
- Contribute to cloud migration and modernization initiatives, particularly on Azure.
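As a small illustration of the Snowflake portion of this role, here is a sketch using the snowflake-connector-python package to run a parameterized validation query; the connection values, table name, and load date are placeholders, not details from the posting.

```python
# Snowflake validation sketch using the snowflake-connector-python package.
# All connection parameters and the table name are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER",
    password="***",
    account="myorg-myaccount",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Bind variables (%s) keep the query safe and plan-cache friendly.
    cur.execute(
        "SELECT COUNT(*) FROM sales_fact WHERE load_date = %s",
        ("2024-01-01",),
    )
    row_count = cur.fetchone()[0]
    print(f"rows loaded for 2024-01-01: {row_count}")
finally:
    conn.close()
```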
Posted 1 week ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!