0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Finance
Job Family Group: Business Support Group

Job Description:

About us: At bp, we're reimagining energy for people and our planet. With operations across almost every part of the energy system, we're leading the way in reducing carbon emissions and developing more sustainable methods for solving the energy challenge. We're a team with multi-layered strengths of engineers, scientists, traders and business professionals determined to find answers to problems. And we know we can't do it alone. We're looking for people who share our passion for reinvention, bring fresh perspectives and ambition, and challenge our thinking in our goal to achieve net zero. We believe our portfolio of businesses and investments in growth and transformation will result in a company with the scale, brand, capabilities, talent, and values to succeed as the digital revolution transforms our society, our industry and our planet.

Key Accountabilities

Data Quality/Modelling/Design thinking:
- Drawing on SAP MDG/ECC experience, investigates and performs root cause analysis for assigned use cases; also works with Azure Data Lake (via Databricks) using SQL/Python.
- Identifies and builds the data model (conceptual and physical) that provides an automated mechanism for monitoring ongoing DQ issues. Multiple workshops may be needed to work through the options and identify the one that is most efficient and effective.
- Works with the business (Data Owners/Data Stewards) to profile data and expose patterns indicating data quality issues (see the profiling sketch after this posting); also identifies the impact on the specific CDEs deemed important by each individual business.
- Identifies the financial impact of a data quality issue, as well as the business benefit (quantitative/qualitative) of remediation, and leads implementation timelines.
- Schedules regular working groups with businesses that have identified DQ issues and ensures progress on RCA/remediation or on presenting in DGFs.
- Identifies the business DQ rules on which critical metrics/measures are stood up, feeding the dashboards/workflows used for BAU monitoring; red flags are raised and investigated.
- Understands the data quality value chain, starting with Critical Data Element concepts, data quality issues, and data quality metrics/measures; has experience owning and completing data quality issue assessments to drive improvements to operational processes and BAU initiatives.
- Highlights risks and hidden DQ issues to the Lead/Manager for further guidance/escalation.
- Communication skills are critical in this outward-facing role; the focus must be on clearly articulated messages.

Dashboarding & Workflow:
- Builds and maintains effective analytics and escalation mechanisms which detect poor data and help business lines drive resolution.
- Supports the design, build and deployment of data quality dashboards via Power BI.
- Constructs escalation paths, workflows and alerts which advise process and data owners of unresolved data quality issues.
- Collaborates with IT & analytics teams to drive innovation (AI, ML, cognitive science etc.).

DQ Improvement Plans:
- Creates, embeds and drives business ownership of DQ improvement plans.
- Works with business functions and projects to create data quality improvement plans.
- Sets targets for data improvements/maturity; monitors and intervenes when sufficient progress is not being made.
- Supports initiatives which drive data clean-up of the existing data landscape.

Project Delivery:
- Oversees and advises Data Quality Analysts and participates in the delivery of data quality activities, including profiling, establishing conversion criteria and resolving technical and business DQ issues.
- Owns and develops relevant data quality work products as part of the DAS data change methodology.
- Ensures data quality aspects are delivered as part of Gold and Silver data-related change projects.
- Supports the creation of business cases with insight into the cost of poor data.

Essential Experience and Job Requirements:
- 11-15 total years of experience in Oil & Gas or the Financial Services/Banking industry within the data management space.
- Experience working with data models/structures and investigating how to design and fine-tune them.
- Experience of data quality management preferred, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (papers production, quorum maintenance, minutes publication), CDE identification, and data lineage (identification of authoritative data sources); an understanding of metrics/measures is needed as well.
- Experience working with senior partners across multiple data domains/business areas, the CDO and Technology.
- Ability to operate in global teams across multiple time zones.
- Ability to operate in an evolving and changing setup, identify priorities, and work independently without much direction.

Desirable criteria:
- SAP MDG/SAP ECC experience (T-codes, table structures etc.)
- Azure Data Lake/AWS/Databricks
- Crafting dashboards & workflows (Power BI, QlikView or Tableau etc.)
- Crafting analytics and insight in a DQ setting (Power BI/Power Query)
- Profiling and analysis skills (SAP DI, Informatica or Collibra)
- Persuading, influencing and communicating at a senior management level
- Certification in Data Management, Data Science, or Python/R desirable

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Skills: Agility core practices, Analytical Thinking, Commercial Acumen, Communication, Creativity and Innovation, Data Analysis, Decision Making, Digital fluency, Integration, Managing strategic partnerships, Research and insights, Risk Management, Stakeholder Engagement, Stakeholder Management, Sustainability awareness and action

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
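As a concrete illustration of the profiling work this role describes, here is a minimal PySpark sketch of a null-rate check that might run in a Databricks notebook against a lake table. The table name, CDE list, and threshold are hypothetical placeholders, not bp's actual setup.

```python
# Minimal data-quality profiling sketch for a Databricks notebook.
# Assumes a hypothetical lake table `finance.vendor_master` whose
# critical data elements (CDEs) are listed below.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

CDES = ["vendor_id", "payment_terms", "country_code"]  # hypothetical CDEs
THRESHOLD = 0.02  # flag any CDE with more than 2% missing/blank values

df = spark.table("finance.vendor_master")
total = df.count() or 1  # guard against an empty table

# Null/blank rate per critical data element, computed in one pass.
profile = df.select([
    (F.count(F.when(F.col(c).isNull() | (F.trim(F.col(c)) == ""), c)) / total)
    .alias(c)
    for c in CDES
])

rates = profile.collect()[0].asDict()
breaches = {c: r for c, r in rates.items() if r > THRESHOLD}
if breaches:
    # In practice this would feed a Power BI dashboard or raise a workflow alert.
    print(f"DQ red flags: {breaches}")
```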
Posted 2 weeks ago
3.0 - 7.0 years
3 - 5 Lacs
Hyderabad
On-site
About Alphanext
Alphanext is a global talent solutions company with offices in London, Pune, and Indore. We connect top-tier technical talent with forward-thinking organizations to drive innovation and transformation through technology.

Position Summary
Alphanext is hiring an experienced Informatica Developer with a strong background in data warehousing and data integration using the Informatica PowerCenter and IDMC platforms. This role requires hands-on expertise in integrating diverse data sources, working with enterprise relational databases, and delivering robust and scalable ETL solutions. Experience in mainframe and Salesforce integration is highly desirable.

Key Responsibilities
- Design and develop data integration workflows using Informatica PowerCenter/IDMC.
- Implement data warehousing strategies in accordance with enterprise standards.
- Integrate data from mainframe, Salesforce, and other modern data sources using PowerExchange.
- Work with Informatica Web Services, Java transformations, and XML for dynamic data processing.
- Collaborate with database teams to manage and optimize data in Oracle and SQL Server environments.
- Write and maintain UNIX/Linux shell scripts to support ETL processes.
- Ensure data accuracy, consistency, and performance through robust SQL scripting and testing (a reconciliation sketch follows this posting).

Required Skills
- 3 to 7 years of experience in Informatica PowerCenter and IDMC on data integration or warehousing projects.
- Expert knowledge of data warehousing concepts, standards, and tools.
- Experience with Informatica PowerExchange for mainframe, Salesforce, and other modern data sources.
- Working knowledge of Informatica Web Services, XML, and Java transformations.
- Strong understanding of relational databases, especially Oracle and SQL Server.
- Good proficiency in UNIX/Linux shell scripting.
- Strong SQL programming skills.

Preferred Skills
- Exposure to AWS services including EC2, S3, and AWS Glue ETL.
- Hands-on experience with TWS/Tidal job schedulers.
- Familiarity with the BFSI domain.
- Experience in Informatica Cloud (nice to have).

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of experience in ETL development and data integration.
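To make the "SQL scripting and testing" duty concrete, here is a small Python sketch of a post-load row-count reconciliation between a source system and a warehouse target. The DSNs and table names are invented placeholders; a real job would be parameterized and wired into the scheduler rather than hard-coded.

```python
# Post-load reconciliation sketch: compare source and target row counts
# after an ETL workflow completes. Connection strings and table names
# are placeholders, not real systems.
import pyodbc

SOURCE_DSN = "DSN=oracle_src"      # hypothetical ODBC DSN for the source
TARGET_DSN = "DSN=sqlserver_dwh"   # hypothetical ODBC DSN for the warehouse

def row_count(dsn: str, table: str) -> int:
    """Return COUNT(*) for a table over the given ODBC connection."""
    with pyodbc.connect(dsn) as conn:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is trusted config
        return cur.fetchone()[0]

src = row_count(SOURCE_DSN, "orders_staging")
tgt = row_count(TARGET_DSN, "dw.fact_orders")

if src != tgt:
    # A real script would log this and fail the scheduler step.
    raise RuntimeError(f"Row count mismatch: source={src}, target={tgt}")
print(f"Reconciled {tgt} rows.")
```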
Posted 2 weeks ago
0 years
0 Lacs
India
On-site
The Consulting Data Engineer role requires experience in both traditional warehousing technologies (e.g. Teradata, Oracle, SQL Server) and modern database/data warehouse technologies (e.g. AWS Redshift, Azure Synapse, Google BigQuery, Snowflake), as well as expertise in ETL tools and frameworks (e.g. SSIS, Azure Data Factory, AWS Glue, Matillion, Talend), with a focus on how these technologies affect business outcomes. This person should have experience with both on-premise and cloud deployments of these technologies, and with transforming data to adhere to logical and physical data models and data architectures and engineering a dataflow to meet business needs. This role will support engagements such as data lake design, data management, migrations of data warehouses to the cloud, and database security models, and ideally should have large-enterprise experience in these areas.

Responsibilities:
- Develops high-performance distributed data warehouses, distributed analytic systems and cloud architectures.
- Participates in developing relational and non-relational data models designed for optimal storage and retrieval.
- Develops, tests, and debugs batch and streaming data pipelines (ETL/ELT) to populate databases and object stores from multiple data sources using a variety of scripting languages; provides recommendations to improve data reliability, efficiency and quality.
- Works alongside data scientists, supporting the development of high-performance algorithms, models and prototypes.
- Implements data quality metrics, standards and guidelines; automates data quality checks/routines as part of data processing frameworks (see the sketch after this posting); validates the flow of information.
- Ensures that data warehousing and big data systems meet business requirements and industry practices, including but not limited to automation of system builds, security requirements, performance requirements and logging/monitoring requirements.

Knowledge, Skills, and Abilities:
- Ability to translate a logical data model into a relational or non-relational solution.
- Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran.
- Hands-on experience in setting up end-to-end cloud-based data lakes.
- Hands-on experience in database development using views, SQL scripts and transformations.
- Ability to translate complex business problems into data-driven solutions.
- Working knowledge of reporting tools like Power BI, Tableau etc.
- Ability to identify data quality issues that could affect business outcomes.
- Flexibility in working across different database technologies and propensity to learn new platforms on the fly.
- Strong interpersonal skills; a team player prepared to lead or support depending on the situation.
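The "automates data quality checks as part of data processing frameworks" item lends itself to a short illustration. The sketch below is plain Python with invented rules; a production framework would persist the metrics and trigger alerts rather than print them.

```python
# Sketch of automating data-quality checks inside a pipeline step.
# The checks themselves are hypothetical examples.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Check:
    name: str
    predicate: Callable[[dict], bool]  # True means the row passes

CHECKS = [
    Check("customer_id present", lambda r: bool(r.get("customer_id"))),
    Check("amount non-negative", lambda r: r.get("amount", 0) >= 0),
]

def run_checks(rows: Iterable[dict]) -> dict:
    """Return a failure count per check, i.e. the raw DQ metrics."""
    rows = list(rows)
    return {
        check.name: sum(1 for r in rows if not check.predicate(r))
        for check in CHECKS
    }

# Example batch; in a real pipeline these rows come from the extract step.
batch = [{"customer_id": "C1", "amount": 10.0}, {"customer_id": "", "amount": -5}]
print(run_checks(batch))  # {'customer_id present': 1, 'amount non-negative': 1}
```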
Posted 2 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Pune
Work from Office
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

Your Role
- Should have 5+ years of experience in Informatica PowerCenter.
- Strong knowledge of ETL concepts, data warehouse architecture and best practices.
- Should be well versed with the different file formats used for parsing files, such as flat file, XML and JSON, and with various source systems for integration (a parsing sketch follows this posting).
- Must have hands-on development experience as an individual contributor in at least 2 project lifecycles (data warehouse/data mart/data migration) in a client-facing environment.

Your Profile
- Design, develop, unit test, deploy and support data applications and infrastructure utilizing various technologies to process large volumes of data.
- Strong technical and functional understanding of RDBMS and DWH-BI concepts.
- Should have implemented error handling, exception handling and audit balance control frameworks.
- Good knowledge of either Unix/shell or Python scripting and a scheduling tool.
- Strong SQL and PL/SQL skills, data analytics and performance tuning capabilities.
- Good to have knowledge of cloud platforms and technologies.

What you'll love about working here
We recognize the significance of flexible work arrangements, be it remote work or flexible work hours, and provide an environment that supports a healthy work-life balance. At the heart of our mission is your career growth: our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Unix and SQL.

Skills (competencies)
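As a small illustration of parsing the file formats named above, here is a sketch that normalizes flat-file/CSV, JSON, and XML sources into a common record shape before a load step. The file names, the top-level JSON array, and the `<record>` element are assumptions for the example.

```python
# Sketch of normalizing records from flat file/CSV, JSON, and XML sources
# into a common dict shape ahead of an ETL load. Standard library only.
import csv
import json
import xml.etree.ElementTree as ET

def from_csv(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)  # header row supplies the field names

def from_json(path):
    with open(path) as f:
        yield from json.load(f)  # assumes a top-level JSON array of objects

def from_xml(path):
    # Assumes repeating <record> elements with simple child fields.
    for rec in ET.parse(path).getroot().iter("record"):
        yield {child.tag: child.text for child in rec}

# Downstream, every source looks the same to the load step, e.g.:
# for row in from_csv("customers.csv"):
#     load(row)
```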
Posted 2 weeks ago
5.0 years
4 - 6 Lacs
Gurgaon
Remote
Our story
At Alight, we believe a company's success starts with its people. At our core, we Champion People, help our colleagues Grow with Purpose and, true to our name, we encourage colleagues to "Be Alight."

Our Values: Champion People – be empathetic and help create a place where everyone belongs. Grow with Purpose – be inspired by our higher calling of improving lives. Be Alight – act with integrity, be real and empower others.

It's why we're so driven to connect passion with purpose. Our team's expertise in human insights and cloud technology gives companies and employees around the world the ability to power confident decisions, for life. With a comprehensive total rewards package, continuing education and training, and tremendous potential with a growing global organization, Alight is the perfect place to put your passion to work. Join our team if you Champion People, want to Grow with Purpose through acting with integrity, and if you embody the meaning of Be Alight. Learn more at careers.alight.com.

ETL Developer
Alight is seeking a skilled and passionate ETL software developer to join our team. As the ETL Developer, you will be a member of a team responsible for various stages of software development, including understanding business requirements, coding, testing, documentation, deployment, and production support. As part of the ETL development team, you will focus on delivering high-quality, enterprise-caliber systems on Informatica PowerCenter, moving data to and from flat files, MS Dynamics CRM and Microsoft SQL Server. Your primary role will involve participating in full life-cycle data integration development projects.

Qualifications:
Knowledge & Experience:
- 5+ years of data integration, data warehousing, or data conversion experience.
- 3+ years of SQL writing and optimization experience.
- 3+ years of Informatica PowerCenter experience.
- 2+ years working with Microsoft SQL Server Management Studio.
- Experience with XML file data integration.
- Experience with UNIX shell scripting.
- Experience with Microsoft Dynamics or another CRM system preferred.
- Strong understanding of using ETL tools to integrate internal and third-party systems.
- Excellent analytical and critical thinking skills.
- Strong interpersonal skills with the ability to work effectively with diverse and remote teams.
- Experience in agile processes and development task estimation.
- Strong sense of responsibility for deliverables.
- Ability to work in a small team with moderate supervision.

Responsibility Areas:
- Design software solutions for small- to medium-complexity requirements independently, adhering to existing standards.
- Develop high-priority and highly complex code for systems based on functional specifications, detailed design, maintainability, and coding and efficiency standards, working independently.
- Estimate and evaluate risks, and prioritize technical tasks based on requirements.
- Collaborate actively with the ETL Lead, Product Owners, Quality Assurance, and stakeholders to ensure high-quality project delivery.
- Conduct formal code reviews to ensure compliance with standards.
- Appropriately utilize system design, development, and process standards.
- Write and execute unit test cases to verify basic functionality, both for your own code and that of your peers.
- Create, maintain, and publish system-level documentation, including system diagrams, with minimal guidance.
- Ensure clarity, conciseness, and completeness of requirements before starting development, collaborating with Business Analysts and stakeholders to evaluate feasibility.
- Take primary accountability for meeting non-functional requirements.

Education:
Bachelor's degree (with preferred concentrations in Computer Science, MIS, or Engineering) or equivalent work experience. Master's degree in a related area preferred. Computer application certifications, as applicable.

Flexible Working
So that you can be your best at work and home, we consider flexible working arrangements wherever possible. Alight has been a leader in the flexible workspace and a "Top 100 Company for Remote Jobs" 5 years in a row.

Benefits
We offer programs and plans for a healthy mind, body, wallet and life because it's important our benefits care for the whole person. Options include a variety of health coverage options, wellbeing and support programs, retirement, vacation and sick leave, maternity, paternity & adoption leave, continuing education and training, as well as a number of voluntary benefit options.

By applying for a position with Alight, you understand that, should you be made an offer, it will be contingent on your undergoing and successfully completing a background check consistent with Alight's employment policies. Background checks may include some or all of the following, based on the nature of the position: SSN/SIN validation, education verification, employment verification, criminal check, search against global sanctions and government watch lists, credit check, and/or drug test. You will be notified during the hiring process which checks are required by the position.

Our commitment to Diversity and Inclusion
Alight is committed to diversity, equity, and inclusion. We celebrate differences and believe in fostering an environment where everyone feels valued, respected, and supported. We know that diverse teams are stronger, more innovative, and more successful. At Alight, we welcome and embrace all individuals, regardless of their background, and are dedicated to creating a culture that enables every employee to thrive. Join us in building a brighter, more inclusive future.
Diversity Policy Statement
Alight is an Equal Employment Opportunity employer and does not discriminate against anyone based on sex, race, color, religion, creed, national origin, ancestry, age, physical or mental disability, medical condition, pregnancy, marital or domestic partner status, citizenship, military or veteran status, sexual orientation, gender, gender identity or expression, genetic information, or any other legally protected characteristics or conduct covered by federal, state or local law. In addition, we take affirmative action to employ and advance in employment qualified minorities, women, disabled persons, disabled veterans and other covered veterans. Alight provides reasonable accommodations to the known limitations of otherwise qualified employees and applicants for employment with disabilities and sincerely held religious beliefs, practices and observances, unless doing so would result in undue hardship. Applicants for employment may request a reasonable accommodation/modification by contacting his/her recruiter.

Authorization to work in the Employing Country
Applicants for employment in the country in which they are applying (Employing Country) must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the Employing Country and with Alight.

Note: this job description does not restrict management's right to assign or reassign duties and responsibilities of this job to other entities, including but not limited to subsidiaries, partners, or purchasers of Alight business units.

We offer you a competitive total rewards package, continuing education & training, and tremendous potential with a growing worldwide organization.
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Surat
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT (an SCD Type-2 sketch follows this posting).
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
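For readers unfamiliar with the SCD Type-2 pattern mentioned above, the sketch below shows the close-and-insert logic as plain Snowflake SQL driven from Python; dbt snapshots generate comparable statements under the hood. All table and column names, including `attr_hash` (a precomputed hash of the tracked attributes), are assumptions for the example.

```python
# Sketch of Slowly Changing Dimension Type-2 maintenance in Snowflake.
# Step 1 expires current rows whose attributes changed; step 2 opens a
# new version for changed and brand-new keys. Names are hypothetical.
import snowflake.connector

CLOSE_CHANGED = """
UPDATE dim_customer
SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
FROM stg_customer s
WHERE dim_customer.customer_id = s.customer_id
  AND dim_customer.is_current
  AND dim_customer.attr_hash <> s.attr_hash
"""

INSERT_VERSIONS = """
INSERT INTO dim_customer (customer_id, name, attr_hash, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.attr_hash, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL
"""

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="DWH", schema="CORE",  # hypothetical
)
try:
    cur = conn.cursor()
    cur.execute(CLOSE_CHANGED)    # expire current rows whose attributes changed
    cur.execute(INSERT_VERSIONS)  # open a new version for changed and new keys
    conn.commit()
finally:
    conn.close()
```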
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Varanasi
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Visakhapatnam
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
Posted 2 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position: Lead Data Engineer
Location: Chennai
Work Hours: 12:00 PM – 9:00 PM IST (US Business Hours)
Availability: Immediate
Experience: 6+ Years

About the Company: Ignitho Inc. is a leading AI and data engineering company with a global presence, including offices in the US, UK, India, and Costa Rica. Visit our website to learn more about our work and culture: www.ignitho.com. Ignitho is a portfolio company of Nuivio Ventures Inc., a venture builder dedicated to developing Enterprise AI product companies across various domains, including AI, Data Engineering, and IoT. Learn more about Nuivio at: www.nuivio.com.

Job Summary
We are looking for a highly skilled Lead Data Engineer to drive the delivery of data warehousing, ETL, and business intelligence solutions. This role involves leading cloud data migrations, building scalable data pipelines, and serving as the primary liaison for client engagements.

Roles & Responsibilities
- Lead end-to-end delivery of data warehousing and BI solutions.
- Act as SPOC for project engagements with cross-functional and international teams.
- Design and implement batch, near real-time, and real-time data pipelines.
- Build scalable data platforms using AWS (Lambda, Glue, Athena, S3); a minimal Lambda/Athena sketch follows this posting.
- Develop and optimize SQL scripts, Python code, and shell scripts for automation.
- Translate business requirements into actionable data insights and reports.
- Manage Agile project execution, ensuring timely and high-quality deliverables.
- Perform root cause analysis, performance tuning, and data profiling.
- Lead legacy data migration to modern cloud platforms.
- Collaborate on data modelling and BI solution design.

Preferred Skills
- Strong Python programming for ETL, APIs, and AWS Lambda.
- Advanced SQL scripting and performance tuning.
- Proficiency in Unix/Linux shell scripting.
- Hands-on experience with AWS services (Lambda, Glue, Athena, S3, CloudWatch).
- Familiarity with GitLab, Bitbucket, or CircleCI.
- Experience in digital marketing data environments is a plus.
- Knowledge of Azure/GCP and tools like Informatica is preferred.
- Exposure to real-time streaming and data lake architectures.

Qualifications
- Bachelor's degree in Computer Science, MCA, or a related field.
- 6+ years of experience in data engineering and BI.
- At least 5 years of project leadership experience.
- AWS/Data Warehousing certifications are a plus.
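As a minimal illustration of the AWS serverless pattern in this stack, here is a sketch of a Lambda handler that starts an Athena query over data in S3. The database, table, and output bucket are placeholders, not a real environment.

```python
# Minimal Lambda/Athena sketch: kick off an Athena query over S3 data.
# Database, table, and the S3 output location are placeholders.
import boto3

athena = boto3.client("athena")

def handler(event, context):
    resp = athena.start_query_execution(
        QueryString="SELECT campaign, COUNT(*) AS clicks FROM clicks GROUP BY campaign",
        QueryExecutionContext={"Database": "marketing"},             # hypothetical
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    # Athena runs asynchronously; a real flow would poll get_query_execution
    # or react to the completion event rather than block inside Lambda.
    return {"queryExecutionId": resp["QueryExecutionId"]}
```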
Posted 2 weeks ago
7.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Title: Consultant/Sr. Consultant – DW&BI | IT CONSULTING
Job Location: Bangalore/Kolkata

As a Sr. Consultant…
You are responsible for delivering an end-to-end solution from the initial concept through the finished solution. You'll need the ability to quickly understand needs from a customer's perspective and move from there to providing appropriate solutions that capture the essence of customers' needs. The ability to not only code but also assemble and integrate technology across disparate platforms is critical to success. This includes a deep understanding of systems, modern scripting and enterprise-level languages, as well as open-source tools that can and should be leveraged to solve problems. You'll work in a rapid environment where there aren't always clear specifications or rules about how something should be done, where it's up to you to figure things out and keep things moving.

Who you are…
We are seeking a skilled and experienced Senior Informatica Intelligent Cloud Services (IICS) Developer to join our data engineering team. The ideal candidate will have extensive experience in developing and managing data integration workflows using IICS, and a solid understanding of cloud data ecosystems, preferably Google Cloud Platform (GCP) and Google BigQuery (GBQ); a minimal BigQuery sketch follows this posting.

Responsibilities
• Design, develop, and maintain ETL/ELT workflows using Informatica IICS.
• Collaborate with business and technical teams to understand requirements and translate them into robust data integration solutions.
• Optimize data pipelines for performance and scalability.
• Integrate IICS solutions with cloud-based data stores like Google BigQuery and cloud storage solutions.
• Develop data mappings, task flows, parameter files, and reusable objects.
• Manage deployments, migrations, and version control for IICS assets.
• Perform unit testing, debugging, and troubleshooting of ETL jobs.
• Document data flow and architecture as part of the SDLC.
• Work in an Agile environment and participate in sprint planning, reviews, and retrospectives.
• Provide mentorship and code reviews for junior developers, ensuring adherence to best practices and coding standards.

Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 7+ years of experience in ETL development, with at least 2–3 years in Informatica IICS.
• Strong experience in data integration, transformation, and orchestration using IICS.
• Good working knowledge of cloud data platforms, preferably Google Cloud Platform (GCP).
• Hands-on experience with Google BigQuery (GBQ), including writing SQL queries, data ingestion, and optimization.
• Strong SQL skills and experience with RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
• Experience in integrating data from various sources including on-prem, SaaS applications, and cloud data lakes.
• Familiarity with data governance, data quality, and data cataloging tools.
• Understanding of REST APIs and experience with API integration in IICS.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills and the ability to work effectively in a team.
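To ground the BigQuery side of this role, here is a minimal sketch using the google-cloud-bigquery client library; the project, dataset, and table names are placeholders.

```python
# Sketch of querying Google BigQuery from Python, the kind of GBQ work
# this role pairs with IICS pipelines. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

sql = """
SELECT order_date, SUM(amount) AS total
FROM `my-gcp-project.sales.orders`
GROUP BY order_date
ORDER BY order_date
"""

# client.query() submits the job; .result() waits for it to finish.
for row in client.query(sql).result():
    print(row["order_date"], row["total"])
```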
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Lucknow
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Ludhiana
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT.
- Integrate and manage data from various sources using SAP Data Services.
- Develop and maintain scalable data models, data marts, and data warehouses.
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs.
- Implement best practices in data governance, data lineage, and metadata management.
- Monitor data quality, troubleshoot issues, and ensure data integrity.
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning).
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins).
- Document architecture, data flows, and technical specifications.
Posted 2 weeks ago
7.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Experience in Finance or Investment Banking Domain
- 7-9 years of experience as a Data Analyst or in a similar role supporting data analytics projects.
- 5+ years of mastery in SQL.
- 5+ years of experience in financial services, insurance, or a related industry.
- Experience with data manipulation using Python.
- Domain knowledge of Investment Management operations, including security masters, securities trade and recon operations, reference data management, and pricing.
- Investment Operations exposure: Critical Data Elements (CDE), data traps and other data recons (see the recon sketch after this posting).
- Familiarity with data engineering concepts: ETL/ELT, data lakes, data warehouses.
- Experience with BI tools like Power BI, MicroStrategy, Tableau.
- Excellent communication, problem-solving, and stakeholder management skills.
- Experience in Agile/Scrum and working with cross-functional delivery teams.
- Proficiency in financial reporting tools (e.g., Power BI, Tableau).
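The "data traps and other data recons" item can be illustrated with a small pandas sketch that compares an internal position extract against a custodian file and surfaces breaks. The file and column names are illustrative only.

```python
# Sketch of a simple position reconciliation: compare an internal extract
# against a custodian file and surface breaks. Names are illustrative.
import pandas as pd

internal = pd.read_csv("positions_internal.csv")    # hypothetical extract
custodian = pd.read_csv("positions_custodian.csv")  # hypothetical file

recon = internal.merge(
    custodian, on="security_id", how="outer",
    suffixes=("_int", "_cust"), indicator=True,
)

# Rows present on only one side are one-sided breaks.
one_sided = recon[recon["_merge"] != "both"]

# Rows present on both sides but with differing quantities are value breaks.
mismatched = recon[
    (recon["_merge"] == "both")
    & (recon["quantity_int"] != recon["quantity_cust"])
]

print(f"{len(one_sided)} one-sided breaks, {len(mismatched)} quantity breaks")
```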
Posted 2 weeks ago
5.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
✅ Job Title: Informatica MDM Developer
🕒 Experience Required: 5+ Years
Location: Mumbai, Hyderabad, Bengaluru, Pune, Chennai

🔍 Job Description:
We are seeking a skilled Informatica MDM Developer with over 5 years of hands-on experience in implementing Informatica Master Data Management (MDM) solutions. The ideal candidate must possess relevant certifications and deep implementation experience, with the ability to work independently and contribute effectively to enterprise MDM initiatives.

🎯 Key Responsibilities:
- Lead and participate in end-to-end Informatica MDM implementations.
- Design and develop data models, match & merge rules, hierarchies, and workflows in Informatica MDM.
- Collaborate with stakeholders to understand business requirements and translate them into MDM solutions.
- Perform unit testing, performance tuning, and deployment support.
- Provide post-deployment support and enhancements for existing MDM systems.

🔧 Primary Must-Have Skills (Non-Negotiable):
- Minimum 5+ years of hands-on Informatica MDM implementation experience.
- Mandatory certification in Informatica MDM (Developer/Implementation Specialist).
- Strong knowledge of Informatica Hub, IDD, SIF API, and User Exits.
- Proficient in data cleansing, matching, merging, and hierarchy management.
- Solid understanding of data quality and data governance concepts.
Posted 2 weeks ago
6.0 - 11.0 years
20 - 30 Lacs
Hyderabad
Hybrid
Role & responsibilities
- 7-9 years of experience as a Data Analyst or in a similar role supporting data analytics projects.
- 5+ years of mastery in SQL.
- 5+ years of experience in financial services, insurance, or a related industry.
- Experience with data manipulation using Python.
- Domain knowledge of Investment Management operations, including security masters, securities trade and recon operations, reference data management, and pricing.
- Investment Operations exposure: Critical Data Elements (CDE), data traps and other data recons.
- Familiarity with data engineering concepts: ETL/ELT, data lakes, data warehouses.
- Experience with BI tools like Power BI, MicroStrategy, Tableau.
- Excellent communication, problem-solving, and stakeholder management skills.
- Experience in Agile/Scrum and working with cross-functional delivery teams.
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are seeking an experienced Oracle Data Integrator (ODI) Developer with in-depth expertise in Oracle GoldenGate to join our team. The ideal candidate will have over 5 years of hands-on experience in building, managing, and optimizing data integration processes. This role requires working with large-scale data environments, facilitating seamless data migrations, and supporting real-time integration strategies.

Key Responsibilities
- Design, develop, and implement data integration solutions using ODI and Oracle GoldenGate.
- Build and maintain ETL processes for data migration, transformation, and integration.
- Design real-time data replication setups using GoldenGate across heterogeneous systems.
- Optimize and troubleshoot data pipelines for maximum performance and reliability.
- Collaborate with cross-functional teams to translate business needs into technical solutions.
- Coordinate with DBAs and infrastructure teams to ensure smooth integration and system performance.
- Manage data warehousing, synchronization, and migration tasks, ensuring data integrity.
- Support GoldenGate replication for high availability, disaster recovery, and data synchronization.
- Ensure the scalability, performance, and security of integration configurations.
- Develop technical documentation and provide training on ODI and GoldenGate processes.
- Support production environments and troubleshoot data integration issues as needed.

Required Skills And Qualifications
- 5+ years of hands-on experience in ODI development and implementation.
- Proven expertise in Oracle GoldenGate, including real-time replication and conflict resolution.
- Strong command of SQL, PL/SQL, and scripting for data manipulation.
- Solid understanding of data modeling, ETL architecture, and multi-system integration.
- Familiarity with Oracle databases and data warehousing concepts.
- Experience with ODI components such as Interfaces, Mappings, Packages, and Procedures.
- Proficient in configuring GoldenGate for high availability and disaster recovery.
- Excellent troubleshooting and optimization skills for data pipelines.
- Experience handling complex data migration and synchronization tasks.
- Ability to thrive in a fast-paced, client-facing environment.

Preferred Skills
- Familiarity with other ETL tools (e.g., Informatica, Talend).
- Knowledge of Oracle Cloud Infrastructure (OCI) or other cloud platforms.
- Certifications in ODI, GoldenGate, or other Oracle technologies.
- Experience with performance tuning in large-scale data integration projects.

Educational Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Relevant Oracle certifications (ODI, GoldenGate) are a plus.

Skills: data modeling, large-scale data integration, Oracle GoldenGate, scripting, data integration, Oracle, data synchronization, Oracle Data Integrator (ODI), ODI, PL/SQL, data migration, ETL, SQL, performance tuning, ETL architecture, data warehousing, Oracle Cloud Infrastructure, OCI, GoldenGate
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a strong ETL, SQL, and Unix support resource working from the ODC at CGI, you will be responsible for providing production support with a focus on ETL (Informatica) and Oracle databases. Essential skills include a good understanding of Unix and SQL, along with knowledge of the CA7 scheduling tool. You will be required to prepare requirement definitions, designs, and technical specifications, as well as provide coding, testing, and implementation support for the identified technical platform. Analyzing user requirements, defining technical project scope, and creating business and/or technical designs for new systems or modifications will also be part of your responsibilities.

In this role, you will have the opportunity to turn meaningful insights into action, working collaboratively with a team of CGI Partners. At CGI, ownership, teamwork, respect, and belonging are key values that drive our work culture. From day one, you will be encouraged to take ownership and contribute to bringing our collective vision to life. As a CGI Partner, you will have a stake in the company's success and play an active role in shaping its strategy and direction. Your work will be instrumental in creating value through innovative solutions and building strong relationships with teammates and clients. You will have access to global capabilities that will enable you to scale your ideas, explore new opportunities, and benefit from extensive industry and technology expertise. By joining CGI, you will have the chance to shape your career in a company committed to growth and longevity. Our leaders prioritize your health and well-being and provide opportunities to enhance your skills and broaden your horizons. Join CGI, one of the largest IT and business consulting services firms globally, and be part of a dynamic team dedicated to making a difference in the world of technology and innovation.
Posted 2 weeks ago
5.0 years
15 - 25 Lacs
Gurugram, Haryana, India
On-site
Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources (a minimal orchestration sketch follows this posting).
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architect, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica
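Since the posting lists Airflow among the workflow tools, here is a minimal orchestration sketch for a daily load-then-dbt-run sequence, written for recent Airflow 2.x. The operators, schedule, and paths are assumptions, not a prescribed setup.

```python
# Minimal Airflow sketch of an ELT orchestration: an extract-load step
# followed by a dbt run. Paths and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    load = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw.py",  # hypothetical loader
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",     # hypothetical path
    )
    load >> transform  # run the dbt models only after the raw load succeeds
```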
Posted 2 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 323795

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Tableau Admin to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Tableau Admin
Reporting to the Manager of Business Intelligence, this position provides application deployment support, maintenance, modernization, and primary research of technical issues. The Tableau Administrator may also assist with special projects for clients, such as researching data issues, researching and implementing new integrations, or developing and implementing processes to manipulate data per customer requests.

Primary Duties & Responsibilities
- Install, configure, and maintain a multi-tier Tableau infrastructure environment to ensure reliable performance and scalability.
- Experience with automation for tasks like deployment of data sources/workbooks across environments and alerts via scripting (PowerShell); a Python-based publishing sketch follows this posting.
- Assist in the development of workbooks and dashboards to support business initiatives.
- Experience in embedded analytics environments and integration of Tableau and Salesforce.
- Experienced with Tableau Server metadata and admin views.
- Manage and support migration of workbooks and dashboards from development to production environments.
- Experience with landing pages and applying row-level security to data sources.
- Work closely with the Tableau vendor on product fixes and enhancements.
- Develop and document standards/best practices for development and administration of the Tableau platform.
- Monitor server activity/usage statistics to identify possible performance issues/enhancements.
- Work with cross-functional teams on the day-to-day execution of data & analytics projects and initiatives.
- Work very closely with Data Architects, DBAs, and the Security team in setting up an environment that meets the needs of users.
- Troubleshoot and optimize Tableau dashboards/workbooks, extraction, etc.; build best-practice and performance guides.
- Solve user issues in the various Tableau applications developed in Tableau Desktop.
- Work with cloud computing platforms (AWS) and other open-source technologies.
- Good understanding of Active Directory, DNS, load balancers and network firewalls.
- Experience configuring single sign-on (SSO) with SAML authentication and an identity provider (IdP).
- Experience with data sources like Redshift, Athena, Aurora, and MS SQL Server.
- Good experience writing SQL queries.
- Demonstrate high personal standards of behavior in a professional environment; demonstrate credibility and competence, and be very proactive.
- Self-motivated, with a willingness and strong desire to learn.
- Hands-on experience installing and configuring QlikView Server and Publisher; maintaining server logs and QlikView Server backups.

Required Skills
- Supported applications: Tableau (Admin/Desktop/Prep), QlikView and Informatica.
- Primary skill should be Tableau administration, with some Tableau development experience.
- Knowledge of QlikView and Informatica PowerCenter/Cloud is preferred.
- Good experience with SQL queries is preferred.
- AWS cloud experience preferred.
- Ability to task-switch effectively based on business priorities.
- Team lead with strong communication and interpersonal skills.
- Strong organizational skills, along with a demonstrated ability to manage multiple tasks simultaneously.
- Communicates effectively with technical and non-technical audiences.
- Self-motivated and driven.
- Can-do attitude: contributing to key improvements and innovations.

Required Knowledge & Experience
Education and work experience required: Bachelor's degree from a college or university with at least 2 years of focused Tableau administration experience and 4+ years of total Tableau experience in an IT department, preferably within a Business Intelligence team, and/or an equivalent combination of education and work experience. Strong technical, analytical and communication skills. Insurance industry experience in life, health and annuity products is a plus.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
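The automation duties above can also be scripted in Python with the tableauserverclient library, Tableau's supported REST API client; PowerShell against the same REST API is equivalent. The server URL, site, credentials, project, and file names below are placeholders.

```python
# Sketch of Tableau Server automation with the tableauserverclient library:
# sign in and publish a workbook to a project. All values are placeholders.
import tableauserverclient as TSC

auth = TSC.TableauAuth("admin_user", "password", site_id="analytics")  # hypothetical
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the target project by name (assumed to exist on the first page).
    all_projects, _ = server.projects.get()
    project = next(p for p in all_projects if p.name == "Finance")

    item = TSC.WorkbookItem(project_id=project.id)
    item = server.workbooks.publish(
        item, "sales_dashboard.twbx", TSC.Server.PublishMode.Overwrite
    )
    print(f"Published workbook {item.name} ({item.id})")
```

The same library covers the other routine tasks the posting names, such as iterating over data sources for refresh alerts or migrating content between dev and prod sites.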
Posted 2 weeks ago
5.0 years
15 - 25 Lacs
Chennai, Tamil Nadu, India
On-site
Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architect, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Java, Spring, Spring Boot, Kafka, REST API, Microservices, Azure, CD Developer at NTT DATA in Chennai, Tamil Nadu (IN-TN), India, you will be part of a dynamic team that values exceptional, innovative, and passionate individuals who are eager to grow with the organization. You will apply your hands-on experience in Java, Spring, Spring Boot, and Kafka to designing robust RESTful APIs and microservices. Additionally, you will work with technologies such as HashiCorp Vault, Terraform, Packer, Kubernetes, and cloud platforms like Azure. Your expertise in API management, cloud infrastructure deployment, CD processes, testing frameworks, and modern programming languages will be essential for success in this role.

To excel in this position, you should have a bachelor's degree in computer science or engineering, along with 6-9 years of experience in Java, Spring Boot, Oracle, Kubernetes, and other relevant technologies. Your strong communication skills, leadership experience, and ability to collaborate with cross-functional teams will be valuable assets. Moreover, your strategic thinking, problem-solving skills, and familiarity with ITIL processes will contribute to the continuous improvement of software solutions.

As a key member of the team, you will play a crucial role in driving meaningful discussions, staying updated on technology trends, and implementing CI/CD practices to deploy changes efficiently. Your willingness to work in a hybrid environment, including at the client location at Ramanujam IT Park, Taramani, Chennai, and your commitment to a return to the office by 2025 align with the general expectations of the role.

If you are ready to take on this challenging yet rewarding opportunity with NTT DATA, a trusted global innovator in business and technology services, apply now to be part of a diverse team dedicated to helping clients innovate, optimize, and transform for long-term success.
Posted 2 weeks ago
5.0 years
15 - 25 Lacs
Greater Kolkata Area
On-site
Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Skills: DWH, GCP, AWS, Snowflake, Airflow, Snowpipe, data analysis, SQL, data architect, Tableau, performance tuning, pipelines, Oracle, ETL, data modeling, Azure, Python, DBT, Azkaban, Power BI, Fivetran, Sigma Computing, data warehousing, Luigi, Informatica
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
You will have the opportunity to work as an individual contributor and maintain positive relationships with stakeholders. It is essential to be proactive in learning new skills as business requirements evolve. Your responsibilities will include extracting relevant data, then cleansing and transforming it to generate insights that drive business value, making effective use of data analytics, data visualization, and data modeling techniques.

Your Impact
- Strong proficiency in MS Power BI.
- Hands-on experience with MS Azure SQL Database.
- Proficient in developing ETL packages using Visual Studio or Informatica.
- Skilled in data analysis and business analysis.
- Expertise in database management and reporting, particularly in SQL and MS Azure SQL.
- Strong critical-thinking and problem-solving abilities.
- Excellent verbal and written communication skills.
- Review and validate customer data during collection.
- Supervise the deployment of data to the data warehouse.
- Collaborate with the IT department on software and hardware upgrades to support big data use cases.
- Monitor analytics and metrics results.
- Implement new data analysis methodologies.
- Conduct data profiling to identify and understand anomalies (a sketch follows this listing).
- Knowledge of Python/R is good to have.

About You
- 2 to 5 years of experience in Power BI.
- A technical Bachelor's degree; non-technical degree holders should have 3+ years of relevant experience.

We value inclusion and diversity at Gallagher. It is an integral part of our business, reflecting our commitment to sustainability and supporting the communities where we operate. Embracing the diverse identities, experiences, and talents of our employees enables us to better serve our clients and communities. Inclusion is a conscious commitment, and diversity is recognized as a vital strength. Through embracing diversity in all its forms, we embody The Gallagher Way to its fullest.

Equal employment opportunities are extended across all aspects of the employer-employee relationship, including recruitment, hiring, training, promotion, transfer, demotion, compensation, benefits, layoff, and termination. Gallagher is committed to making reasonable accommodations for known physical or mental limitations of qualified individuals with disabilities, unless such accommodations would impose undue hardship on our business operations.
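For the data-profiling responsibility above, here is a minimal pandas sketch of the kind of first-pass profile that surfaces anomalies; the file name and column names are invented for illustration.

```python
# A minimal data-profiling sketch, assuming a hypothetical extract
# (customers.csv); column names below are illustrative.
import pandas as pd

df = pd.read_csv("customers.csv")  # hypothetical extract

# Per-column profile: type, share of missing values, and cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile)

# Flag simple anomalies: fully duplicated rows and out-of-range values.
print("duplicate rows:", df.duplicated().sum())
if "age" in df.columns:
    print("suspect ages:", len(df[(df["age"] < 0) | (df["age"] > 120)]))
```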
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As an experienced professional with 8-12 years of experience in database technologies, scripting, BI tools, and data science, you will be responsible for understanding customer requirements, documenting technical designs, and developing ETL jobs, reports, dashboards, and data marts. You will play a key role in translating business requirements into effective designs and implementing solutions using a variety of tools and technologies.

Your responsibilities will include:
- Understanding customer requirements, problem statements, and business use cases.
- Documenting customer requirements and technical designs.
- Developing and testing ETL jobs, query objects, data marts, reports, and dashboards.
- Designing OLAP cubes, SQL reports, interactive charts, and dashboards (a report-query sketch follows this listing).
- Writing stored procedures, functions, packages, scripts, and complex SQL queries.
- Reviewing existing report designs, database schemas, and SQL code to improve performance and operability.
- Collaborating closely with customers to fine-tune solutions and troubleshoot production issues.
- Monitoring tasks efficiently and providing status reports to senior management, customers, and stakeholders.
- Analyzing and resolving problems, including debugging at the OS, database, or network level where necessary.
- Understanding of cloud deployment architecture and Big Data platforms is an added advantage.

To excel in this role, you must have hands-on experience with database technologies such as Oracle, MS SQL Server, MySQL, and PostgreSQL, along with proficiency in scripting languages such as Java, Python, and R. Experience with contemporary reporting and BI tools such as Crystal Reports, Tableau, and Power BI is essential, as is knowledge of ETL platforms such as Informatica and SSIS and of data modeling and data warehouse design. Your expertise in data science, AI, and ML will be applied to designing and implementing data-driven use cases. Strong communication skills, problem-solving abilities, and familiarity with project management concepts and Agile delivery methodology are necessary for successful project execution.

If you are seeking a challenging opportunity to apply your technical skills to innovative solutions in data analytics and business intelligence, this role offers a dynamic environment where you can thrive and make a significant impact.
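As a small, self-contained illustration of the report-style SQL the listing describes (a grouped aggregation plus a running total via a window function), here is a sketch using SQLite so it runs anywhere; the table, columns, and data are invented.

```python
# An illustrative report query: monthly totals per region with a
# running total, using SQLite (window functions require SQLite >= 3.25).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 120), ('North', '2024-02', 150),
        ('South', '2024-01',  90), ('South', '2024-02', 110);
""")

report = conn.execute("""
    WITH monthly AS (
        SELECT region, month, SUM(amount) AS monthly_total
          FROM sales
         GROUP BY region, month
    )
    SELECT region, month, monthly_total,
           SUM(monthly_total) OVER (
               PARTITION BY region ORDER BY month) AS running_total
      FROM monthly
     ORDER BY region, month
""").fetchall()

for row in report:
    print(row)
conn.close()
```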
Posted 2 weeks ago