8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary: We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.

Experience: 8 - 12+ years
Work Location: Hyderabad (Hybrid)
Mandatory skills: Python, SQL, Snowflake
Contract to Hire - 6+ months

Responsibilities:
- Data Architecture: Design and develop scalable, resilient data architectures that support business needs, analytics, and AI/ML workloads.
- Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
- Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
- Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
- Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
- Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
- Technology Evaluation: Stay current with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 8 - 12+ years of experience in data engineering.
- Cloud Platforms: Strong expertise in AWS data services.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and related frameworks.
- Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
- Programming: Proficiency in Python, Scala, or Java for data processing and automation.
- ETL Tools: Experience with tools like Apache Airflow, Talend, dbt, or Informatica.
- Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
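As an illustration of the mandatory Python/SQL/Snowflake combination this posting calls for, here is a minimal, hedged sketch of an ELT step using the Snowflake Python connector. The connection parameters and table names (staging.orders_raw, curated.orders) are placeholder assumptions, not details from the posting.

```python
# Minimal sketch: incremental upsert from a staging table into a curated
# table in Snowflake. All identifiers below are illustrative placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

merge_sql = """
    MERGE INTO curated.orders AS tgt
    USING staging.orders_raw AS src
      ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET
      tgt.status = src.status, tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
      VALUES (src.order_id, src.status, src.updated_at)
"""
try:
    conn.cursor().execute(merge_sql)
finally:
    conn.close()
```

In practice a step like this would be scheduled by an orchestrator such as Airflow and parameterized per table rather than hard-coded.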
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dear Associate,

Greetings from TATA Consultancy Services!! Thank you for expressing your interest in exploring a career possibility with the TCS Family. We have a job opportunity for ETL Test Engineer at Tata Consultancy Services.

Hiring For: ETL Test Engineer
Interview date: 14-June-25 (in-person drive)
Location: Bangalore
Experience: 4-10 years

Must Have:
1. SQL - Expert-level knowledge of core SQL concepts and query writing.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL Automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems.
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
13. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
14. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.

If you are interested in this exciting opportunity, please share your updated resume at saranya.devi3@tcs.com along with the additional information below:
Name:
Preferred Location:
Contact No:
Email id:
Highest Qualification:
University/Institute name:
Current Organization:
Willing to relocate to Bangalore:
Total Experience:
Relevant Experience (ETL Test Engineer):
Current CTC:
Expected CTC:
Notice Period:
Gap Duration:
Gap Details:
Available for in-person interview on 14-June-25:
Timings:
Attended interview with TCS in the past (details):
Please share your iBegin portal EP id if already registered:

Note: Only eligible candidates with relevant experience will be contacted further.
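To make item 5 above concrete, here is a minimal sketch of the kind of source-to-target reconciliation check an ETL tester automates; sqlite3 stands in for the real source and target databases, and the table names are invented for illustration.

```python
# Source-to-target reconciliation sketch: row counts plus a set difference.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
""")

# 1. Row-count reconciliation.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"Row counts differ: {src_count} vs {tgt_count}"

# 2. Content reconciliation: rows present in source but missing in target.
missing = conn.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
""").fetchall()
assert not missing, f"Rows missing in target: {missing}"
print("Reconciliation passed")
```

The same two checks, wrapped in a test framework such as pytest and pointed at real connections, form the core of most ETL regression suites.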
Posted 1 week ago
7.0 years
0 Lacs
India
On-site
Mandatory skill - Salesforce FSC - Insurance
* 7+ years of experience in a functional consulting role within the Insurance industry, with a significant focus on pre-sales activities.
* Proven experience configuring Salesforce FSC components such as Action Plans, Interaction Summaries, and Relationship Groups, specifically tailored to insurance workflows.
* Salesforce FSC Configuration: Deep understanding of FSC data models and features (e.g., Action Plans, Relationship Groups, Interaction Summaries).
* Proficiency in using Flows, Process Builder, Validation Rules, and Custom Objects to build solutions without code.
* Ability to customize and extend standard FSC objects to align with insurance-specific entities like policies, claims, and agents.
* Familiarity with APIs and integration/middleware platforms (e.g., Mulesoft, Dell Boomi, Informatica) to connect Salesforce with core insurance systems.
* Expertise in building real-time analytics and dashboards for insurance KPIs (e.g., policy volume, claims cycle time).
* Salesforce Financial Services Cloud Consultant Certification is preferred.

Scope of work -
* Lead discovery sessions with insurance business stakeholders to understand their key challenges, processes, and objectives across underwriting, claims, servicing, and other relevant areas.
* Gather and document detailed business requirements, translating them into functional specifications for Salesforce FSC solutions.
* Design and configure Salesforce FSC solutions leveraging declarative tools, specifically tailored to address the unique needs of insurance processes.
* Collaborate closely with cross-functional teams (including technical architects and sales teams) to ensure alignment between business needs and technical feasibility.
* Support testing efforts, user training, and user adoption, including participation in User Acceptance Testing (UAT) and issue resolution.
* Provide on-demand consulting and solution recommendations to prospective clients based on insurance industry best practices and deep knowledge of Salesforce FSC capabilities.
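For the API and integration skills listed above, here is a hedged sketch of querying policy records through Salesforce's standard REST query endpoint from Python. The instance URL, access token, API version, and the exact object/field names are assumptions for illustration; a production integration would typically run through middleware such as Mulesoft or Dell Boomi rather than direct calls.

```python
# Hypothetical sketch: read FSC policy records over the Salesforce REST API.
import requests

INSTANCE_URL = "https://yourorg.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "00D...session_token"                # obtained via OAuth beforehand

# SOQL query; object and field names are illustrative assumptions.
soql = "SELECT Id, Name FROM InsurancePolicy LIMIT 10"
resp = requests.get(
    f"{INSTANCE_URL}/services/data/v59.0/query",
    params={"q": soql},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```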
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
We are looking for an experienced SAP HANA/SQL Developer with 6 to 10 years of hands-on experience in analytics, data warehousing, and ETL development. The ideal candidate should be highly skilled in complex SQL programming, SAP HANA development, and modern ETL tools such as Azure Data Factory, SAP BODS, Informatica, or SSIS. You will be responsible for analyzing requirements, designing scalable data solutions, and implementing high-performance applications with strong attention to detail, performance, and security.

Key Responsibilities
- Understand and analyze business requirements and convert them into technical design documents.
- Design, develop, and optimize SAP HANA data models (views, procedures, indexes).
- Develop and troubleshoot complex SQL queries, stored procedures, and scripts.
- Implement ETL solutions using Azure Data Factory, BODS, Informatica, or SSIS.
- Collaborate with cross-functional teams to deliver high-quality analytics and reporting solutions.
- Conduct root cause analysis and performance tuning of data models and ETL processes.
- Prepare technical documentation and training materials as needed.
- Provide accurate effort estimations for new and enhancement work.
- Ensure compliance with coding standards, security guidelines, and performance benchmarks.
- Actively contribute to continuous improvement and process optimization.

Required Skills
- 6-10 years of experience in SQL development and SAP HANA.
- Hands-on experience with complex SQL programming, stored procedures, and performance tuning.
- Strong experience with at least one ETL tool: Azure Data Factory, SAP BODS, Informatica, or SSIS.
- Solid understanding of data warehouse concepts and reporting architecture.
- Familiarity with cloud environments such as Azure SQL and Azure Data Factory is a plus.
- Strong problem-solving, analytical, and communication skills.
- Self-starter with the ability to take ownership and work in a fast-paced team environment.

Preferred Qualifications
- Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
- SAP certification in HANA or Data Management tools is an added advantage.
- Knowledge of Agile/Scrum methodologies.

Skills: SSIS, SQL, SAP HANA, SAP HANA development, problem-solving, analytical skills, data warehousing, SAP BODS, Azure, technical documentation, data models, SQL programming, SAP, BODS, performance tuning, HANA, ETL, communication skills, ETL development, Informatica, Azure Data Factory
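As a small illustration of the HANA development work described, the following sketch runs an analytical SQL query from Python using SAP's hdbcli driver; the host, credentials, and table/column names are placeholders, not details from the posting.

```python
# Minimal sketch of querying SAP HANA from Python via the official hdbcli driver.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",  # placeholder host
    port=30015,
    user="DEV_USER",
    password="***",
)
cursor = conn.cursor()
# The kind of analytical query that performance tuning would target:
cursor.execute("""
    SELECT material_id, SUM(quantity) AS total_qty
    FROM sales_items
    WHERE posting_date >= ADD_DAYS(CURRENT_DATE, -30)
    GROUP BY material_id
    ORDER BY total_qty DESC
    LIMIT 20
""")
for row in cursor.fetchall():
    print(row)
conn.close()
```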
Posted 1 week ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Azure Data Engineer
Location: Gurugram

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

🧠 What You'll Do
🔹 Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
🔹 Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
🔹 Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

🎯 Tech Stack
☁️ Azure | 🧱 Databricks | 🐍 PySpark | 📊 SQL

👤 What We're Looking For
✅ 3+ years of experience in data engineering or analytics engineering
✅ Hands-on with cloud data platforms and large-scale data processing
✅ Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms such as Azure, AWS, GCP, or Databricks. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.

Best Regards,
Santosh Cherukuri
Email: scherukuri@bayonesolutions.com
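For a flavor of the Databricks/PySpark lakehouse work described above, here is a minimal bronze-to-silver step; the storage paths, column names, and quality rule are illustrative assumptions.

```python
# Bronze-to-silver sketch: deduplicate, apply a basic quality rule, and
# write a partitioned Delta table. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.format("delta").load(
    "abfss://lake@account.dfs.core.windows.net/bronze/sales"
)

silver = (
    bronze
    .dropDuplicates(["order_id"])                 # remove replayed records
    .filter(F.col("amount") > 0)                  # basic quality rule
    .withColumn("ingest_date", F.to_date("ingest_ts"))
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("ingest_date")
    .save("abfss://lake@account.dfs.core.windows.net/silver/sales"))
```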
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We have an exciting job opportunity - Lead Data Engineer: Snowflake

Experience - 7 to 10 years only
Mandatory Skills - Minimum 3+ years of experience with Snowflake (either migrating from another DB to Snowflake or pulling data into Snowflake); SQL; any ETL tool (SSIS, Talend, Informatica, Databricks, ADF (preferred), etc.); team-handling experience
Location - Hyderabad / Chennai
Mode - Hybrid
Notice Period - Immediate joiner to 15 days

Job Summary:
· We are seeking an experienced Lead Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions while providing strategic direction and leadership to a team of junior and mid-level data engineers. The ideal candidate will have deep expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The lead data engineer role has a strong focus on performance optimization, security, scalability, and Snowflake credit control and management. This is a tactical role requiring independent in-depth data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake.

Essential Functions and Tasks:
· Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
· Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
· Optimize Snowflake database performance, storage, and security.
· Provide guidance on Snowflake best practices.
· Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
· Ensure data quality, integrity, and governance across the organization.
· Provide technical leadership and mentorship to junior and mid-level data engineers.
· Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
· Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
· 7+ years of in-depth data engineering experience, with at least 3+ years of dedicated experience engineering solutions in a Snowflake environment.
· Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
· Strong experience with cloud platforms (preference to Azure) and their data services.
· Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
· Hands-on experience with scripting languages like Python for data processing.
· Strong understanding of data governance, security, and compliance best practices.
· Snowflake SnowPro certification; preference to the engineering course path.
· Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
· Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
· Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
· Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
· Ability to self-manage large complex deliverables and document user stories and tasks through Azure DevOps.
· Personal accountability to committed sprint user stories and tasks.
· Strong analytical and problem-solving skills with the ability to handle complex data challenges.
· Ability to read, understand, and apply state/federal laws, regulations, and policies.
· Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
· Ability to remain flexible and work within a collaborative and fast-paced environment.
· Understand and comply with company policies and procedures.
· Strong oral, written, and interpersonal communication skills.
· Strong time management and organizational skills.

About Our Client:
Our client is a leading business solutions provider for facility-based physicians practicing anesthesia, emergency medicine, hospital medicine, and now radiology, through the recent combining of forces with Advocate RCM. The company is focused on Revenue Cycle Management and advisory services. Having grown consistently every year, it now has over 5,000 employees and is headquartered in Dallas, US.

Kindly share your updated resume at ankita.jaiswal@firstwave-tech.com
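Since the role emphasizes Snowflake credit control and management, here is a hedged sketch of a credit-burn check against Snowflake's ACCOUNT_USAGE share; the connection details and the alert threshold are placeholders.

```python
# Summarize 7-day warehouse credit usage from the ACCOUNT_USAGE share and
# flag heavy consumers. Threshold and credentials are illustrative.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
)
rows = conn.cursor().execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
""").fetchall()
for warehouse, credits in rows:
    if credits > 100:  # illustrative alert threshold
        print(f"WARN: {warehouse} used {credits:.1f} credits in 7 days")
conn.close()
```

In a real deployment this kind of check would feed an alerting channel, and resource monitors would enforce hard caps on top of it.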
Posted 1 week ago
4.0 years
0 Lacs
India
On-site
Job Summary: We are seeking a skilled Semarchy MDM Consultant/Developer to join our data management team. The ideal candidate will have hands-on experience with Semarchy xDM and a deep understanding of MDM concepts, data modeling, data integration, and data governance.

Key Responsibilities:
- Design, develop, and implement Master Data Management (MDM) solutions using Semarchy xDM.
- Develop and configure data models, entities, match & merge rules, workflows, and data validations within the Semarchy platform.
- Integrate Semarchy with various data sources (ETL tools, APIs, databases).
- Collaborate with business analysts and data stewards to gather requirements and implement effective MDM strategies.
- Ensure data quality, consistency, and governance across business domains.
- Create documentation for technical designs, data flows, configurations, and operational processes.
- Monitor and optimize MDM performance and troubleshoot issues.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4+ years of experience in Master Data Management (MDM) implementations.
- 2+ years of hands-on experience in Semarchy xDM development and configuration.
- Strong SQL skills and knowledge of relational databases (e.g., Oracle, SQL Server, PostgreSQL).
- Experience with data integration tools (e.g., Talend, Informatica, Apache NiFi) is a plus.
- Understanding of MDM domains like Customer, Product, Supplier, etc.
- Experience in REST/SOAP APIs, data profiling, and data quality tools is an advantage.
- Good understanding of data governance, stewardship, and metadata management.
- Excellent problem-solving skills and communication abilities.
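To illustrate the match & merge concept behind MDM rule configuration, here is a conceptual sketch in plain Python (deliberately not the Semarchy xDM API): an exact-match rule on city combined with a fuzzy-match rule on name, of the kind a consultant would express declaratively in the platform.

```python
# Conceptual match-rule sketch: exact city match AND fuzzy name match.
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Normalized similarity score between two customer names."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

records = [
    {"id": 1, "name": "Acme Corp", "city": "Pune"},
    {"id": 2, "name": "ACME Corp.", "city": "Pune"},
    {"id": 3, "name": "Globex Ltd", "city": "Chennai"},
]

THRESHOLD = 0.8  # illustrative match threshold
for i, r1 in enumerate(records):
    for r2 in records[i + 1:]:
        if r1["city"] == r2["city"] and \
           name_similarity(r1["name"], r2["name"]) >= THRESHOLD:
            print(f"Candidate duplicate pair: {r1['id']} <-> {r2['id']}")
```

Records 1 and 2 are flagged as a candidate pair; a survivorship rule would then decide which attribute values win in the merged golden record.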
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Engineer
Experience range: 4-10 years
Location: Current location must be Bangalore ONLY
NOTE: Candidates interested in a walk-in drive in Bangalore must apply.

Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and query writing.
3. ETL Automation - Experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
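As an example of the transformation-validation work in this role, here is a sketch comparing grouped aggregates between a staging table and the transformed target. Table and column names are placeholders, and the SQL assumes a dialect supporting FULL OUTER JOIN and IS DISTINCT FROM (e.g., PostgreSQL or Snowflake).

```python
# Aggregate-level validation: find groups whose totals disagree between the
# staging source and the warehouse fact table. `conn` is any DB-API connection.

CHECK_SQL = """
    SELECT COALESCE(s.region, t.region) AS region,
           s.total AS src_total,
           t.total AS tgt_total
    FROM (SELECT region, SUM(amount) AS total
          FROM stg_sales GROUP BY region) s
    FULL OUTER JOIN
         (SELECT region, SUM(amount) AS total
          FROM dw_sales_fact GROUP BY region) t
      ON s.region = t.region
    WHERE s.total IS DISTINCT FROM t.total
"""

def validate_aggregates(conn) -> list:
    """Return regions whose totals disagree; an empty list means the check passed."""
    cur = conn.cursor()
    cur.execute(CHECK_SQL)
    return cur.fetchall()
```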
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Infrastructure Lead/Architect
Job Type: Full-Time
Location: On-site - Hyderabad, Pune, or New Delhi

Job Summary
Join our customer's team as an Infrastructure Lead/Architect and play a pivotal role in architecting, designing, and implementing next-generation cloud infrastructure solutions. You will drive cloud and data platform initiatives, ensure system scalability and security, and act as a technical leader, shaping the backbone of our customers' mission-critical applications.

Key Responsibilities
- Architect, design, and implement robust, scalable, and secure AWS cloud infrastructure utilizing services such as EC2, S3, Lambda, RDS, Redshift, and IAM.
- Lead the end-to-end design and deployment of high-performance, cost-efficient Databricks data pipelines, ensuring seamless integration with business objectives.
- Develop and manage data integration workflows using modern ETL tools in combination with Python and Java scripting.
- Collaborate with Data Engineering, DevOps, and Security teams to build resilient, highly available, and compliant systems aligned with operational standards.
- Act as a technical leader and mentor, guiding cross-functional teams through infrastructure design decisions and conducting in-depth code and architecture reviews.
- Oversee project planning, resource allocation, and deliverables, ensuring projects are executed on time and within budget.
- Proactively identify infrastructure bottlenecks, recommend process improvements, and drive automation initiatives.
- Maintain comprehensive documentation and uphold security and compliance standards across the infrastructure landscape.

Required Skills and Qualifications
- 8+ years of hands-on experience in IT infrastructure, cloud architecture, or related roles.
- Extensive expertise with AWS cloud services; AWS certifications are highly regarded.
- Deep experience with Databricks, including cluster deployment, Delta Lake, and machine learning integrations.
- Strong programming and scripting proficiency in Python and Java.
- Advanced knowledge of ETL/ELT processes and tools such as Apache NiFi, Talend, Airflow, or Informatica.
- Proven track record in project management, leading cross-functional teams; PMP or Agile/Scrum certifications are a plus.
- Familiarity with CI/CD workflows and Infrastructure as Code tools like Terraform and CloudFormation.
- Exceptional problem-solving, stakeholder management, and both written and verbal communication skills.

Preferred Qualifications
- Experience with big data platforms such as Spark or Hadoop.
- Background in regulated environments (e.g., finance, healthcare).
- Knowledge of Kubernetes and AWS container orchestration (EKS).
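As a small, hedged example of the AWS automation this role involves, the following boto3 script flags S3 buckets missing a cost-allocation tag; the required tag key is an assumed governance policy, not one stated in the posting.

```python
# Tag-compliance audit: list S3 buckets lacking a 'cost-center' tag.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
REQUIRED_TAG = "cost-center"  # illustrative tagging policy

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        tags = s3.get_bucket_tagging(Bucket=name)["TagSet"]
        keys = {t["Key"] for t in tags}
    except ClientError:
        keys = set()  # get_bucket_tagging raises when the bucket has no tags
    if REQUIRED_TAG not in keys:
        print(f"Untagged bucket (no '{REQUIRED_TAG}'): {name}")
```

The same pattern scales to EC2, RDS, and Redshift via their respective describe/list-tags APIs, and is a common building block for FinOps reporting.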
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 06/09/2025
Job Type: Full time
Industry: Technology
Work Experience: 5+ years
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600096

Job Description
What you'll be working on:
- Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Learn something new every day.

What we are looking for:
- Bachelor's or master's degree in a technical or business discipline, or related experience; Master's degree preferred.
- 4+ years of hands-on experience effectively managing data platforms and data tools, and/or depth in data management technologies.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience with orchestration tools like Airflow.
- Experience with any of the ETL tools like Talend, Informatica, etc.
- Experience with data warehouse solutions like Snowflake and Redshift.
- Exposure to data visualization tools (Tableau, Sisense, Looker, Metabase, etc.).
- Knowledge of GitHub and JIRA is a plus.
- Familiarity with data warehouse and data governance concepts.
- Experience developing software code in one or more programming languages (Java, JavaScript, Python, etc.) is a plus.

Requirements
Knowledge/Skills/Abilities/Behaviours:
- A "build-test-measure-improve" mentality and the drive to motivate and lead teams to achieve impactful deliverables.
- Passion for operational efficiency, quantitative performance metrics, and process orientation.
- Working knowledge of project planning methodologies and IT standards and guidelines.
- Customer passion, business focus, and the ability to negotiate, facilitate, and build consensus.
- The ability to promote a team environment across a large set of separate agile teams and stakeholders.
- Experience with or knowledge of Agile software development methodologies.

Benefits
Work at SquareShift: We offer a stimulating atmosphere where your efforts will have a significant impact on our company's success. We are a fun, client-focused, results-driven company that centers on developing high-quality software, not work schedules and dress codes. We are driven by people who have a passion for technology and innovation, and we are committed to continuous improvement.

Does this role excite you? Apply via the link below!
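To make the Airflow orchestration requirement concrete, here is a minimal Airflow 2.x DAG sketch with two dependent tasks; the DAG id, task bodies, and schedule are placeholders.

```python
# Minimal extract->load DAG sketch for Airflow 2.x.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source API / database")

def load():
    print("write to the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",   # placeholder name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # 'schedule' replaces schedule_interval in 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # declares the dependency ordering
```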
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025

Description & Requirements

Introduction: A Career at HARMAN - HARMAN Technology Services (HTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN DTS, you solve challenges by creating innovative solutions.
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs.
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility.
- Empower companies to create new digital business models, enter new markets, and improve customer experiences.

About the Role
We are seeking an experienced "Azure Data Architect" who will develop and implement data engineering projects, including an enterprise data hub, data lakehouse, or big data platform.

What You Will Do
- Create data pipelines for more efficient and repeatable data science projects.
- Design and implement data architecture solutions that support business requirements and meet organizational needs.
- Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams.
- Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems.
- Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently.
- Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes.
- Ensure compliance with regulatory and industry standards for data management and security.
- Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting.
- Ensure data quality, accuracy, and consistency across all data sources.
- Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi.
- Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio.
- Knowledge of data governance, data quality, and data security best practices.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with programming languages such as Python, Java, or Scala.
- Experience with data visualization tools such as Tableau, Power BI, or QlikView.
- Understanding of analytics and machine learning concepts and tools.
- Knowledge of project management methodologies and tools to manage and deliver complex data projects.
- Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra.
- Strong expertise in cloud-based databases such as AWS S3/AWS Glue and AWS Redshift, and the Iceberg/Parquet file formats.
- Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data.
- Proficient in data integration techniques to combine data from various sources into a centralized location.
- Strong data modeling, data warehousing, and data integration skills.

What You Need
- 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead.
- 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects.
- Experience in working on RFPs/proposals, pre-sales activities, and business development, and in overseeing delivery of data projects, is highly desired.
- A master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics.
- Demonstrated ability to manage data projects and diverse teams.
- Experience in creating data and analytics solutions.
- Experience in building data solutions in one or more domains - Industrial, Healthcare, Retail, Communication.
- Problem-solving, communication, and collaboration skills.
- Good knowledge of data visualization and reporting tools.
- Ability to normalize and standardize data as per key KPIs and metrics.
- Develop and implement data engineering projects, including a data lakehouse or big data platform.

What is Nice to Have
- Knowledge of Azure Purview is a must.
- Knowledge of Azure Data Fabric.
- Ability to define reference data architecture.
- Snowflake SnowPro Advanced certification.
- Cloud-native data platform experience in the AWS or Microsoft stack.
- Knowledge of the latest data trends, including data fabric and data mesh.
- Robust knowledge of ETL, data transformation, and data standardization approaches.
- Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions.
- Lead the selection, deployment, and management of data tools, platforms, and infrastructure.
- Ability to technically guide a team of data engineers.
- Oversee the design, development, and deployment of data solutions.
- Define, differentiate, and strategize new data services/offerings and create reference architecture assets.
- Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
- Guide and inspire the organization about the business potential and opportunities around data.
- Network with domain experts.
- Collaborate with client teams to understand their business challenges and needs.
- Develop and propose data solutions tailored to clients' specific requirements.
- Influence client revenues through innovative solutions and thought leadership.
- Lead client engagements from project initiation to deployment.
- Build and maintain strong relationships with key clients and stakeholders.
- Build reusable methodologies, pipelines, and models.

What Makes You Eligible
- Build and manage a high-performing team of data engineers and other specialists.
- Foster a culture of innovation and collaboration within the data team and across the organization.
- Demonstrated ability to work in diverse, cross-functional teams in a dynamic business environment.
- Candidates should be confident, energetic self-starters with strong communication skills.
- Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire.
- Provide technical guidance and mentorship to the data team.
- Collaborate with other stakeholders across the company to align the vision and goals.
- Communicate and present the data capabilities and achievements to clients and partners.
- Stay updated on the latest trends and developments in the data domain.

What We Offer
- Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.).
- Professional development opportunities through HARMAN University's business and leadership academies.
- An inclusive and diverse work environment that fosters and encourages professional and personal development.
- "Be Brilliant" employee recognition and rewards program.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you - all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information, or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Why Patients Need You
The Revenue Management (RM) Digital Solutions team manages technology strategy and delivery for Pfizer's Commercial Business Units, the Global Access and Value (GAV) organization, and Pfizer Commercial Finance teams. The team plays a critical role in ensuring patients can easily access Pfizer's breakthrough medicines through contracting opportunities with Payers and Providers.

What You Will Achieve
This role will be responsible for configuring, maintaining, and enhancing strategic technology platforms and related reporting dashboards that support market access pre-deal analytics, post-deal analytics, and broader performance reporting. These web-based tools are used by Market Access Strategy, Pricing and Analytics (MASPA) teams that are responsible for payer and provider contract analytics. The candidate will be accountable for finding the most effective ways for technology to support Pfizer's business objectives in these domains and optimize the return on all technology investments. The ideal candidate would be someone with strong technical skills in data management tools, data architecture, data profiling/quality tools, AWS services, Tableau, Snowflake, Informatica Cloud Services, and Dataiku AI/ML tools. This role will work closely with functional and technical leads and become a subject matter expert with solutions to identify, develop, and deploy processes and reporting. The role will lead Deal Analytics data integration, create analytic solutions, and utilize data prep and visualization platforms.

How You Will Achieve It
- Evaluates and implements solutions to meet business requirements, ensuring consistent usage and adherence to data management best practices.
- Collaborates with product owners to prioritize features and manage technical requirements based on business needs, new technologies, and known issues.
- Develops application design and documentation for leadership teams.
- Assists in defining the vision for the shared data model, including sourcing, transformation, and loading approaches.
- Manages daily operations of the team, ensuring on-time delivery of milestones.
- Accountable for end-to-end delivery of program outcomes within budget, aligning with relevant business units.
- Fosters collaboration with internal and external stakeholders, including software vendors and data providers.
- Works independently with minimal supervision, capable of making recommendations.

You will have the opportunity to:
- Demonstrate a solid ability to tell a story with simplistic views of complex datasets.
- Deliver data reliability, efficiency, and best-in-class data governance, ensuring security and compliance.
- Be an integral part of developing a best-in-class solution for the GAV organization.
- Build dashboard and reporting proofs of concept as needed; develop reporting and analysis templates and tools.
- Work in close collaboration with business teams throughout MASPA (Market Access Strategy, Pricing and Analytics) to determine tool functionality/configuration and data requirements, ensuring that the analytic capability supports the most current business needs.
- Partner with the Digital Client Partners to align on priorities, processes, and governance, and ensure experimentation to activate innovation and pipeline value.

Qualifications
Must-Have
- Bachelor's degree in Computer Science, Software Engineering, or an engineering-related area.
- 5+ years of relevant experience emphasizing data modeling, development, or systems engineering.
- 1+ years with a data visualization tool (e.g., Tableau, Power BI).
- 2+ years of experience in any number of the following tools, languages, and databases: MySQL, SQL, Aurora DB, Redshift, Snowflake.
- Demonstrated capabilities in integrating and analyzing heterogeneous datasets; ability to identify trends, spot outliers, and find patterns.
- Demonstrated expertise and capabilities in matrixed, cross-functional teams and influencing without authority.
- Proven experience and demonstrated skills with AWS services, Tableau, Airflow, Python, and Dataiku.
- Experience with DevSecOps tools such as JIRA and GitHub.
- Experience in database design tools.
- Deep knowledge of Agile methodologies and SDLC processes.
- Excellent written, interpersonal, and oral communication skills; able to communicate and liaise broadly across functions and the global organization.
- Strong analytical, critical thinking, and troubleshooting skills.
- Ambition to learn and utilize emerging technologies while working in a stimulating team environment.

Nice-to-Have
- Advanced degree in Computer Engineering, Computer Science, Information Systems, or a related discipline.
- Knowledge of GenAI and LLM frameworks (OpenAI, AWS).
- US Market Access functional knowledge and data literacy.
- Statistical analysis to understand and improve possible limitations in models.
- Experience in AI/ML frameworks.
- Pytest and CI/CD tools.
- Experience in UI/UX design.
- Experience in solution architecture and product engineering.

Organizational Relationships
- Global Access and Value Organization
- Market Access Strategy, Pricing and Analytics
- Channel Management, Contract Operations
- Trade Operations
- Government Pricing
- Managed Markets Finance
- Vaccines Business Unit Leadership
- Commercial Leaders for newly launched brands
- AIDA

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities:
- Analyze and manage customer master data using Reltio or Informatica MDM solutions.
- Perform advanced SQL queries and data analysis to validate and ensure master data integrity.
- Leverage Python, PySpark, and Databricks for scalable data processing and automation.
- Collaborate with business and data engineering teams for continuous improvement in MDM solutions.
- Implement data stewardship processes and workflows, including approval and DCR mechanisms.
- Utilize AWS cloud services for data storage and compute processes related to MDM.
- Contribute to metadata and data modeling activities.
- Track and manage data issues using tools such as JIRA and document processes in Confluence.
- Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience:
- Master's degree with 1-3 years of experience in Business, Engineering, IT, or a related field, OR
- Bachelor's degree with 2-5 years of experience in Business, Engineering, IT, or a related field, OR
- Diploma with 6-8 years of experience in Business, Engineering, IT, or a related field.

Functional Skills:
Must-Have Skills:
- Advanced SQL expertise and data wrangling.
- Strong experience in Python and PySpark for data transformation workflows.
- Strong experience with Databricks and AWS architecture.
- Knowledge of MDM, data governance, stewardship, and profiling practices.
- In addition to the above, candidates with experience with Informatica or Reltio MDM platforms will be preferred.

Good-to-Have Skills:
- Experience with IDQ, data modeling, and approval workflow/DCR.
- Background in the Life Sciences/Pharma industries.
- Familiarity with project tools like JIRA and Confluence.
- Strong grip on data engineering concepts.

Professional Certifications:
- Any ETL certification (e.g., Informatica).
- Any data analysis certification (SQL, Python, Databricks).
- Any cloud certification (AWS or Azure).

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
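As an illustration of the SQL/PySpark data-profiling skills the role asks for, here is a short sketch computing per-column null rates and duplicate natural keys; the input path and key columns are assumptions for illustration.

```python
# Quick profiling sketch for a master-data table: null rates per column and
# duplicate natural keys that a match/merge process would need to resolve.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdm_profile").getOrCreate()
df = spark.read.parquet("s3://bucket/customer_master/")  # placeholder path

# Null rate per column (guard against an empty frame).
total = df.count() or 1
null_rates = df.select([
    (F.sum(F.col(c).isNull().cast("int")) / total).alias(c) for c in df.columns
])
null_rates.show()

# Duplicate natural keys; columns are illustrative.
(df.groupBy("customer_name", "postal_code")
   .count()
   .filter(F.col("count") > 1)
   .show())
```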
Posted 1 week ago
3.0 - 8.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Role & responsibilities
- Observability: Ensure end-to-end monitoring of pipelines, data services, infrastructure, and databases. Proactively detect and resolve issues to maintain system health.
- FinOps: Track and optimize platform usage and cost. Understand cost drivers, perform forecasting, and automate cost control measures.
- User Management: Manage onboarding, offboarding, and role-based access controls across tools including Tableau, Snowflake, and AWS.
- Privileged Access Management: Oversee and audit elevated access to critical systems in compliance with security policies.
- Application Maintenance: Perform regular maintenance, updates, and health checks on platform components to ensure operational stability.
- Service Desk Management: Triage and resolve incidents, service requests (SRs), and problems. Maintain the BAU roster and collaborate with cross-functional teams.
- Minor Enhancements: Address low-effort business enhancements (2-3 days) through a structured request process.
- Business Continuity Planning: Maintain and test Business Continuity Plans (e.g., Tableau DR) to ensure platform resilience.
- Deployment Services: Support production deployments, bug fixes, and enhancements in line with CI/CD pipelines.
- Data Load Fixes: Resolve failures in data ingestion due to scheduling, connectivity, infrastructure, or secret rotation issues.
- Transformations/Data Model Support: Provide Level 1 triage for issues arising from schema changes, malformed data, or source inconsistencies.
- Functional Data Questions: Perform initial triage for data requests or quality issues, and coordinate with domain-specific data analysts as needed.
- Project Support: Offer support for projects needing platform team involvement.
- License Review: Participate in quarterly Tableau license reviews and ensure license compliance.
- Documentation: Maintain procedures, work instructions, and knowledge base (KB) articles for operational consistency and knowledge transfer.

Preferred candidate profile
- 3+ years of experience in a Data Platform Support, DevOps, or Operations role.
- Hands-on experience with tools like Tableau, Snowflake, AWS, and Informatica Cloud.
- Familiarity with ITSM practices (e.g., incident, problem, change management).
- Proficiency with Jira, CI/CD workflows, and monitoring tools.
- Strong documentation, communication, and stakeholder management skills.
Posted 1 week ago
12.0 - 15.0 years
0 Lacs
India
On-site
Data Analyst
Experience: 12 to 15 years
Location: Bangalore, Pune, Hyderabad, Chennai, Noida, Kolkata, Mumbai
Interview mode: Round 1 virtual; L2/HR round will be face-to-face

Mandatory Skills:
- Data Analyst - minimum 10 years of experience required
- Data profiling - minimum 5 years of experience required
- SQL - minimum 8 years of experience required
- Data Quality tools (IDMC or Alteryx) - minimum 5 years of experience required

JD
Primary Skills:
- Strong proficiency in SQL
- Data profiling and data quality tools (e.g., IDMC, Alteryx)
- Excellent communication and collaboration skills

Secondary Skills:
- Experience with ADF, Snowflake, and Databricks
- Knowledge of Spark and any ETL tools (SSIS, Informatica, etc.)
- Power BI for data analysis and reporting
- Domain knowledge in Claims and Insurance

Job Responsibilities:
- Act as the primary point of contact for platform-related inquiries
- Communicate platform updates and changes to relevant teams and stakeholders
- Collaborate effectively with multiple stakeholders across teams
- Facilitate coordination between development teams and other departments dependent on the platform
- Work within Agile practices to ensure timely and efficient project delivery
- Utilize data profiling and quality tools to ensure integrity and consistency of data

Good to Have:
- Strong understanding of the Insurance domain and Claims processing
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us:
ArcelorMittal was formed in 2006 from the strategic merger of the European company Arcelor and the Indian-owned Mittal Steel. Over a journey of two decades, we have emerged as the world's leading steel and mining company, exerting our influence across 60+ countries with a robust industrial footprint in 18. We are a global team of 158,000+ talented individuals committed to building a better world with smarter low-carbon steel. Our strategies are not just about scale; they're also about leading a transformative change where innovation meets sustainability. We supply to major global markets—from automotive and construction to household appliances and packaging—supported by world-class R&D and distribution networks.

ArcelorMittal Global Business and Technologies in India is our new hub of technological innovation and business solutions. Here, you'll find a thriving community of business professionals and technologists who bring together diverse and unique perspectives and experiences to disrupt the global steel manufacturing industry. This fusion ignites groundbreaking ideas and unlocks new avenues for sustainable business growth. We nurture a culture fueled by an entrepreneurial spirit and a passion for excellence, which prioritizes the advancement and growth of our team members. With flexible career pathways and access to the latest technology and business tools, we offer a space where you can learn, take ownership, and face exciting challenges every day.

Job title: Senior Engineer - Business Intelligence (BI) Developer (Tableau / Power BI)

We are seeking a skilled and detail-oriented Business Intelligence (BI) Developer with 4 to 6 years of experience in developing and managing BI solutions using Tableau or Power BI. The ideal candidate will have strong analytical skills, a deep understanding of data modeling, and a proven ability to translate business requirements into interactive dashboards and insightful reports.

Key responsibilities:
- Design, develop, and deploy BI solutions using Tableau or Power BI
- Collaborate with stakeholders to gather and understand business requirements
- Create data models, visualizations, dashboards, and reports to support decision-making
- Optimize dashboards for performance and usability
- Work with data teams to ensure data accuracy and integrity
- Develop and maintain documentation for BI solutions and processes
- Automate data flows and support data refresh schedules
- Perform ad-hoc data analysis and provide actionable insights
- Stay up to date with BI trends and tools, suggesting enhancements as needed

Minimum skills:
- 4-6 years of experience in BI development, with a focus on Tableau or Power BI
- Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, PostgreSQL)
- Proficiency in DAX (for Power BI) or Tableau calculations
- Experience with ETL processes and tools (e.g., SSIS, Alteryx, Informatica)
- Solid understanding of data warehousing and dimensional modeling
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud is a plus
- Excellent communication, problem-solving, and stakeholder collaboration skills
- Demonstrable experience applying agile methodologies (Scrum)
- Strong analytical and problem-solving skills and an ability to navigate complexity
- Good team player
- Advanced level of English

Preferred skills:
- Experience integrating BI tools with cloud data sources (e.g., Azure Synapse, Amazon Redshift)
- Knowledge of scripting languages like Python or R for advanced analytics
- Exposure to Agile/Scrum project environments
- Certification in Tableau or Power BI (optional but desirable)
- Familiarity with Alteryx
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us:
ArcelorMittal was formed in 2006 from the strategic merger of the European company Arcelor and the Indian-owned Mittal Steel. Over a journey of two decades, we have emerged as the world's leading steel and mining company, exerting our influence across 60+ countries with a robust industrial footprint in 18. We are a global team of 158,000+ talented individuals committed to building a better world with smarter low-carbon steel. Our strategies are not just about scale; they're also about leading a transformative change where innovation meets sustainability. We supply to major global markets—from automotive and construction to household appliances and packaging—supported by world-class R&D and distribution networks.

ArcelorMittal Global Business and Technologies in India is our new hub of technological innovation and business solutions. Here, you'll find a thriving community of business professionals and technologists who bring together diverse and unique perspectives and experiences to disrupt the global steel manufacturing industry. This fusion ignites groundbreaking ideas and unlocks new avenues for sustainable business growth. We nurture a culture fueled by an entrepreneurial spirit and a passion for excellence, which prioritizes the advancement and growth of our team members. With flexible career pathways and access to the latest technology and business tools, we offer a space where you can learn, take ownership, and face exciting challenges every day.

Job Title: Engineer - Business Intelligence (BI) Developer (Tableau / Power BI)

We are seeking a skilled and detail-oriented Business Intelligence (BI) Developer with 2 to 4 years of experience in developing and managing BI solutions using Tableau or Power BI. The ideal candidate will have strong analytical skills, a deep understanding of data modeling, and a proven ability to translate business requirements into interactive dashboards and insightful reports.

Key responsibilities:
- Design, develop, and deploy BI solutions using Tableau or Power BI
- Collaborate with stakeholders to gather and understand business requirements
- Create data models, visualizations, dashboards, and reports to support decision-making
- Optimize dashboards for performance and usability
- Work with data teams to ensure data accuracy and integrity
- Develop and maintain documentation for BI solutions and processes
- Automate data flows and support data refresh schedules
- Perform ad-hoc data analysis and provide actionable insights
- Stay up to date with BI trends and tools, suggesting enhancements as needed

Minimum skills:
- 2-4 years of experience in BI development, with a focus on Tableau or Power BI
- Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, PostgreSQL)
- Proficiency in DAX (for Power BI) or Tableau calculations
- Experience with ETL processes and tools (e.g., SSIS, Alteryx, Informatica)
- Solid understanding of data warehousing and dimensional modeling
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud is a plus
- Excellent communication, problem-solving, and stakeholder collaboration skills
- Demonstrable experience applying agile methodologies (Scrum)
- Strong analytical and problem-solving skills and an ability to navigate complexity
- Good team player
- Advanced level of English

Preferred skills:
- Experience integrating BI tools with cloud data sources (e.g., Azure Synapse, Amazon Redshift)
- Knowledge of scripting languages like Python or R for advanced analytics
- Exposure to Agile/Scrum project environments
- Certification in Tableau or Power BI (optional but desirable)
- Familiarity with Alteryx
Posted 1 week ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Company Overview: Viraaj HR Solutions is a leading recruitment firm dedicated to connecting top talent with their ideal workplace. Our mission is to streamline the hiring process for both employers and job seekers while ensuring a positive experience. We value integrity, professionalism, and excellence in our services. At Viraaj HR Solutions, we foster a culture of collaboration and continuous improvement, helping clients find candidates who not only meet their technical needs but also align with their organizational values.

Job Title: Informatica Developer
Location: On-Site, India (this opening is posted for Bhubaneswar, Bengaluru, Kochi, and Pune)

Role Responsibilities:
- Design, develop, and maintain ETL processes using Informatica PowerCenter.
- Create and optimize mappings, workflows, and sessions based on business requirements.
- Collaborate with data architects to ensure data modeling and data flow align with project goals.
- Conduct performance tuning of ETL processes to enhance overall system efficiency.
- Implement and manage data quality assurance processes to ensure data integrity.
- Work closely with business analysts to gather requirements and understand data needs.
- Develop and manage SQL scripts for data extraction and transformation tasks.
- Monitor ETL processes and troubleshoot any issues that arise during execution.
- Document technical specifications and maintain project documentation for reference.
- Participate in code reviews to ensure best practices and adherence to standards.
- Assist in the upgrade and maintenance of existing Informatica environments.
- Utilize version control tools to manage code changes and collaborative efforts.
- Engage in Unix scripting for job scheduling and automation tasks as necessary.
- Provide technical support to end users and stakeholders for Informatica applications.
- Stay updated with the latest ETL tools and methodologies to improve data management processes.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 3 years of experience in Informatica development.
- Strong knowledge of SQL and relational databases.
- Experience with data warehousing concepts and methodologies.
- Familiarity with performance tuning techniques in ETL processes.
- Hands-on experience designing mappings and workflows in Informatica.
- Proficiency in data governance and data quality procedures.
- Understanding of Unix/Linux environments.
- Experience with version control systems.
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to work independently and in a team setting.
- Experience with other ETL tools is a plus.
- Knowledge of cloud technologies and integration is an advantage.
- Certification in Informatica is desirable but not mandatory.

Skills: data warehousing, performance tuning, data quality assurance, version control, Informatica developer, ETL
Posted 1 week ago
5.0 - 6.0 years
4 - 8 Lacs
Bengaluru
On-site
Req ID: 325294. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Specialist to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Responsibilities:
- 5-6 years of application support experience supporting .NET and Azure apps.
- Ability to debug and coordinate with development teams to ensure efficient issue resolution.
- Monitoring tools: Splunk, Application Insights.
- Management of deployments and addressing root causes for payment-related issues.
- Monitor Informatica batch job failures and provide insights for downstream dependencies.
- Proactively handle payment and application downtime issues.
- Handle deployment monitoring and validations, escalating critical issues to the development team for resolution when necessary.

Shifts: Rotational 24x7

Mandatory Skills:
- High-level programming languages: C# (.NET MVC, .NET Core and .NET 6/7)
- UI: Angular, JavaScript, CSS, ASP.NET MVC
- API: REST Web API, Azure Functions, or Azure Durable Functions
- CI/CD: Azure Pipelines, Terraform
- Scripting: PowerShell, Bash
- Database: Microsoft SQL Server or NoSQL (e.g., Cosmos DB) and Oracle
- Containerization: Azure Kubernetes Service, Kubernetes (open source) and Docker
- Agile knowledge

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
Posted 1 week ago
0 years
0 Lacs
Bengaluru
On-site
Job requisition ID: 80948
Date: Jun 7, 2025
Location: Bengaluru
Designation: Senior Consultant
Entity: IDMC
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Area: MDM CoE - MDM
Grade: Associate/Sr. Associate
# of people: 9
Skill Set: Informatica MDM
Location: Gurgaon / Bangalore
Years of Experience: 4-7
Comments: Should be able to lead MDM delivery as a solution architect and contribute to BD

Mandatory skill sets: Informatica, MDM
Preferred skill sets: Informatica, MDM
Years of experience required: 4-8
Qualifications: BTech/MBA/MTech/MCA
Required Skills: Informatica MDM
Posted 1 week ago
15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About BNP Paribas Group: BNP Paribas is a top-ranking bank in Europe with an international profile. It operates in 71 countries and has almost 199,000 employees. The Group ranks highly in its three core areas of activity: Domestic Markets and International Financial Services (whose retail banking networks and financial services are grouped together under Retail Banking & Services) and Corporate & Institutional Banking, centred on corporate and institutional clients. The Group helps all of its clients (retail, associations, businesses, SMEs, large corporates and institutional) to implement their projects by providing them with services in financing, investment, savings and protection. In its Corporate & Institutional Banking and International Financial Services activities, BNP Paribas enjoys leading positions in Europe, a strong presence in the Americas and a solid and fast-growing network in the Asia/Pacific region.

About BNP Paribas India Solutions: Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, a leading bank in Europe with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 6,000 employees to provide support and develop best-in-class solutions.

About Business Line/Function: CIB Client Engagement and Protection IT focuses on applications servicing Client Lifecycle Management, Due Diligence/KYC, Customer Relationship Management, Service Request Management, Referential and Data Quality, Pre-Trade Transaction Screening and Anti-Money Laundering. Technologies in use include Java, .NET, Angular, Informatica, Power BI, Fenergo, Siebel, Actimize, Camunda and Drools on private cloud infrastructure. Agile and DevSecOps practices are widely used. The landscape includes a mix of established projects and some under transition to new platforms.

Job Title: JAVA Technical Project Manager
Date: 20-June-24
Department: CEP IT
Location: Chennai
Business Line / Function: Client Engagement and Protection IT
Reports To (Direct): Head of CRM & SRM
Grade (if applicable): AVP/VP
Number of Direct Reports: N/A
Directorship / Registration: NA

Position Purpose: We are seeking a highly skilled and experienced Technical Project Manager with a strong background in JAVA development to lead our Hobart Chennai project. The ideal candidate will have a deep understanding of software development methodologies, project management best practices, and JAVA technologies. The Technical Project Manager will be responsible for managing the entire software development lifecycle (SDLC), from requirements gathering to deployment, and will work closely with cross-functional teams to ensure successful project delivery. You will run technical projects in the vertical, which includes providing direction, project planning and tracking, and reporting; unblocking technical challenges; bringing oversight and ownership; and working in a globally distributed setup as a first among equals in the technology space.

Responsibilities

Direct Responsibilities:
- Lead software development projects from inception to deployment, ensuring that projects are completed on time, within budget, and to the required quality standards.
- Own technical projects, including planning, tracking and implementation.
- Work closely with cross-functional teams, including developers, QA engineers, business analysts, and stakeholders, to ensure that project requirements are clearly defined and understood.
- Develop and maintain project plans, schedules, and budgets, and track progress against these plans.
- Identify and manage project risks and issues, and develop contingency plans as needed.
- Ensure that project deliverables meet the required quality standards, and that all project documentation is complete and up to date.
- Communicate project status, risks, and issues to stakeholders and senior management, and provide regular project status reports.
- Mentor and coach team members and provide guidance and support as needed.
- Capacity planning, leave planning, recruitment, succession planning.
- Lead the scrum team; performance assessment of team members.
- Stay up to date with the latest JAVA technologies and development methodologies, and apply this knowledge to improve project delivery.
- Propose, review and challenge application architecture and design.
- Lead automation and guide teams to align with shift-left and shift-right strategy by encouraging an automation-first mindset and reducing recursive manual effort.
- Be hands-on and lead by example; resolve performance bottlenecks.
- Keep up to date with the latest technologies and trends, and provide inputs, expertise and recommendations.

Contributing Responsibilities:
- Contribute towards innovation; suggest new practices for efficiency improvement.
- Upskill members in the vertical.

Technical & Behavioral Competencies:
- Strong communication skills, both written and verbal.
- Strong leadership skills and ability to self-manage.
- Ability to prioritize and meet strict deadlines.
- Ability to communicate ideas to the team and management.
- Inspire commitment of team members to deliver.
- Resourceful: able to quickly understand the complexities involved and provide the way forward.
- Take ownership of complex and challenging topics and find solutions.
- Strong knowledge of design patterns and development principles.
- Strong hands-on experience with Core Java, J2EE, Spring Framework, Spring Boot, Angular, PL/SQL or Oracle.
- Strong hands-on knowledge of backend technologies.
- Experience with Kubernetes, microservices and distributed databases.
- Practical experience with scalability, reliability, and stability of applications.
- Architectural enhancement / re-design of applications that are already live.
- Experience working with build tools like Maven and DevOps tools like Bitbucket, Git, Jenkins and SonarQube.
- Strong experience of Agile, Scrum and DevOps.
- Development experience of MVC-architecture-based web applications, including creation of web services (RESTful APIs / SOAP services).
- Ability and willingness to learn and work on diverse technologies (languages, frameworks, and tools).
- Self-motivated, with good interpersonal skills and an inclination to constantly upgrade on new technologies and frameworks.

Nice To Have Skills:
- Worked in the area of product development and complex technical projects.
- Knowledge/experience of Dynatrace.
- Knowledge/experience of NoSQL databases (MongoDB, Cassandra), Kafka.
- Some exposure to caching technologies like Redis or Apache Ignite.
- Exposure to client management or the financial domain.
- Industry-related certifications, e.g. TOGAF.

Skills Referential

Behavioural Skills:
- Ability to synthesize / simplify
- Personal impact / ability to influence
- Attention to detail / rigor
- Ability to deliver / results-driven

Transversal Skills:
- Analytical ability
- Ability to manage / facilitate a meeting, seminar, committee or training
- Ability to inspire others and generate people's commitment
- Ability to develop and leverage networks
- Ability to anticipate business / strategic evolution

Education Level: Bachelor's degree or equivalent
Experience Level: At least 15 years
Posted 1 week ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
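Because SQL and ETL fundamentals come up in nearly every Informatica interview, it can help to rehearse the basic extract-transform-load pattern in code. The sketch below is illustrative only: it uses Python's built-in sqlite3 module as a stand-in for a real source system and warehouse, and the table and column names (stg_customers, dim_customers) are hypothetical.

```python
import sqlite3

# Illustrative sketch only: tables and columns are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a small staging table standing in for a real source system.
cur.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, city TEXT)")
cur.executemany(
    "INSERT INTO stg_customers VALUES (?, ?, ?)",
    [(1, "  Asha  ", "Hyderabad"), (2, "Ravi", None), (1, "  Asha  ", "Hyderabad")],
)

# Transform + Load: trim whitespace, default missing cities, and
# deduplicate before loading into the target dimension table.
cur.execute("CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute(
    """
    INSERT INTO dim_customers (id, name, city)
    SELECT DISTINCT id, TRIM(name), COALESCE(city, 'UNKNOWN')
    FROM stg_customers
    """
)

for row in cur.execute("SELECT * FROM dim_customers ORDER BY id"):
    print(row)  # (1, 'Asha', 'Hyderabad') then (2, 'Ravi', 'UNKNOWN')
conn.commit()
```

A PowerCenter mapping would express the same trim, default, and deduplication logic through transformations rather than hand-written SQL, but the underlying reasoning interviewers probe is the same.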
As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!