
47 Datawarehouse Jobs

Set up a job alert
JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

karnataka

On-site

In this role, you will be responsible for implementing and maintaining high-quality Datawarehouse and reporting solutions, drawing on more than 7 years of experience. Your contributions will adhere to best practices, ensuring the performance, maintainability, and scalability of the platform. You will work in a collaborative environment with business and IT stakeholders throughout the full implementation lifecycle.

Key responsibilities include:
- Delivering data warehouse solutions on the SQL Server platform.
- Applying data modeling experience in either dimensional or Data Vault 2.0 approaches.
- Working with automation tools to streamline processes.

Qualifications required for this role:
- Proficiency in both SQL and NoSQL database technologies (Oracle, SQL, etc.).
- Strong understanding of Datawarehouse modeling (Kimball, relational, or similar) and programming languages such as Python.
- Experience with Datawarehouse and MSBI technologies and related frameworks (SSIS/SSRS/Power BI, etc.).
- Working knowledge of Power BI and the Microsoft SQL Server platform.
- Familiarity with Azure and cloud-based data platforms.
- Ability to develop components, use versioning/Git tools, and work with backend services/APIs for data transfer and integration.

Additionally, Société Générale offers a stimulating and caring environment where you can grow, feel useful on a daily basis, and develop your expertise. As an employee, you can dedicate several days per year to solidarity actions during working hours, contributing to various charitable causes and initiatives. The company is also committed to accelerating its ESG strategy by implementing ESG principles across all activities and policies, including ESG assessment, reporting, project management, IT activities, and responsible practices for environmental protection. Join Société Générale and be part of a team dedicated to creating positive change and shaping the world of tomorrow through innovation and action.

Posted 14 hours ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an experienced ETL Developer at our company, your role will involve understanding Business Unit requirements and developing ETL pipelines using Informatica.

Your responsibilities will include:
- Gathering requirements from stakeholders and seeking clarification down to the smallest detail.
- Planning, executing, and developing ETL scripts using Informatica.
- Highlighting and escalating risks or concerns related to assigned tasks to your immediate supervisor.
- Conducting unit testing of ETL processes to ensure the quality of work output.
- Supporting project delivery teams in implementing data management processes.
- Identifying and resolving data quality issues such as uniqueness, integrity, accuracy, consistency, and completeness in a timely and cost-effective manner.
- Providing production support and handling escalations on a rotational basis.

Qualifications required for this role include:
- BE/B Tech/MSc/MCA with a specialization in Computer Science/Information Systems.
- Minimum of 6 years of experience with the Informatica Data Integration Tool.
- Minimum of 6 years of experience writing SQL queries and working with Oracle databases.
- Minimum of 3 years of experience in Python scripting.
- Exposure to scheduling tools such as Control-M/Autosys.
- Familiarity with Data Quality processes or Informatica tool components such as IDQ (Informatica Data Quality, Developer & Analyst tools).
- Strong communication skills and a proven track record of working effectively in team environments.
- Self-starter with the ability to prioritize and manage a complex workload.
- Experience in interpersonal and relationship management with strong organizational skills.
- Capacity to acquire in-depth knowledge of the relevant business area.
- Ability to work collaboratively as part of a team.
- Proficiency in following both the SDLC life cycle and the Agile development life cycle, based on project requirements.

You will not be required to travel for this role, and the work schedule is a mid-shift from 2 PM to 11 PM.
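The listing pairs Informatica work with Python scripting and data-quality checks (uniqueness, integrity, completeness). A minimal sketch of the kind of standalone check a developer might script is shown below; the file name and column names are hypothetical, and in practice such rules would usually run inside IDQ or directly against the Oracle source.

```python
import pandas as pd

# Hypothetical extract of a staging table pulled from Oracle (e.g., via a query export).
df = pd.read_csv("stg_customer_extract.csv")

KEY_COLUMN = "customer_id"                                  # assumed primary-key column
REQUIRED_COLUMNS = ["customer_id", "email", "created_dt"]   # assumed mandatory fields

# Uniqueness: the key column should not contain duplicates.
duplicate_keys = df[df.duplicated(subset=[KEY_COLUMN], keep=False)]

# Completeness: mandatory fields should not be null.
null_ratios = {col: df[col].isna().mean() for col in REQUIRED_COLUMNS}

print(f"Rows checked: {len(df)}")
print(f"Duplicate key rows: {len(duplicate_keys)}")
for col, ratio in null_ratios.items():
    print(f"Null ratio for {col}: {ratio:.2%}")

# A real job would log these metrics and fail the workflow when thresholds are breached.
```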

Posted 2 days ago

Apply

8.0 - 12.0 years

16 - 30 Lacs

noida, hyderabad, pune

Work from Office

Solid understanding of the GCP ETL framework, data lake, and Datawarehouse concepts. Strong BigQuery SQL and PySpark/Python skills. Solid knowledge of BigQuery and the wider GCP data architecture (mainly Dataproc, Dataflow, Data Fusion, Cloud Composer, Pub/Sub, Cloud Functions, Kubernetes).
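For illustration only, a minimal sketch of running a BigQuery SQL aggregation from Python with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical, and a production pipeline on this stack would more likely be orchestrated through Cloud Composer or Dataproc.

```python
from google.cloud import bigquery

# Assumes application-default credentials are configured (e.g., on Dataproc or Composer).
client = bigquery.Client(project="my-analytics-project")  # hypothetical project id

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-analytics-project.sales_dw.fact_orders`   -- hypothetical warehouse table
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and print the daily totals.
for row in client.query(sql).result():
    print(row.order_date, row.total_amount)
```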

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

Role Overview:
At EY, you will have the opportunity to shape a unique career with the global scale, support, inclusive culture, and technology to help you become the best version of yourself. Your voice and perspective are valued as contributions to making EY even better. Join EY to create an exceptional experience for yourself and to help build a better working world for all.

Key Responsibilities:
- Utilize tools and techniques to analyze data collection, updates, storage, and exchange.
- Define and apply data modeling and design standards, tools, best practices, and development methodologies.
- Design, review, and maintain data models.
- Perform data analysis to capture data requirements and visualize them in data models.
- Manage the data model lifecycle from requirements through design, implementation, and maintenance.
- Collaborate with data engineers to create optimal physical data models.
- Identify opportunities to leverage data for enhancing business activities.

Qualifications Required:
- Bachelor's degree in Computer Science or equivalent, with 3-7 years of industry experience.
- Experience in Agile-based delivery methodology is preferable.
- Strong analytical skills with a proactive problem-solving approach.
- Proficiency in software development best practices.
- Excellent debugging and optimization skills.
- Experience in enterprise-grade solution implementations and in converting business challenges into technical solutions.
- Strong communication skills, both written and verbal, formal and informal.
- Participation in all phases of the solution delivery life cycle, including analysis, design, development, testing, deployment, and support.
- Client management skills.

Additional Details:
EY aims to build a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Through data and technology, diverse EY teams worldwide provide assurance and support clients in growth, transformation, and operations across assurance, consulting, law, strategy, tax, and transactions, tackling complex global challenges by asking better questions to find innovative solutions.

Posted 4 days ago

Apply

8.0 - 13.0 years

13 - 23 Lacs

kolkata, hyderabad, pune

Work from Office

GCP Senior Data Engineer. Primary skill: Google BigQuery. Experience: 8+ years.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an experienced ETL Developer at our company, your role involves understanding Business Unit requirements and developing ETL pipelines using Informatica. You will be responsible for planning, executing, and testing ETL scripts, as well as supporting project delivery teams in implementing data management processes. Your attention to detail and ability to identify and resolve data quality issues will be crucial in ensuring the effectiveness and efficiency of our data processes. Additionally, you will provide production support on a rotational basis.

Key Responsibilities:
- Gather requirements from stakeholders and seek clarification down to the smallest detail.
- Develop ETL scripts using Informatica.
- Highlight and escalate risks or concerns to your immediate supervisor when necessary.
- Conduct unit testing of ETL processes to maintain quality standards.
- Support project delivery teams in implementing data management processes.
- Identify and resolve data quality issues in a timely and cost-effective manner.
- Provide production support on a rotational basis.

Qualifications Required:
- Bachelor's degree in Computer Science/Information Systems (BE/B Tech/MSc/MCA).
- Minimum of 6 years of experience with the Informatica Data Integration Tool.
- Minimum of 6 years of experience writing SQL queries for Oracle databases.
- Minimum of 3 years of experience in Python scripting.
- Exposure to scheduling tools such as Control-M or Autosys.
- Familiarity with Data Quality processes or Informatica tool components such as IDQ.
- Strong communication skills and a proven track record of collaborating effectively within teams.
- Ability to prioritize and manage a complex workload.
- Experience with both SDLC and Agile development methodologies.
- Strong interpersonal and relationship management skills, along with excellent organizational abilities.
- Capacity to gain a deep understanding of the relevant business area.

Please note that this position does not require any travel and follows a mid-shift schedule from 2 PM to 11 PM.

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

maharashtra

On-site

As a Data Ops Capability Deployment Analyst at Citi, you will be a seasoned professional contributing to the development of new solutions and techniques for the Enterprise Data function. Your role involves performing data analytics and analysis across various asset classes, as well as building data science capabilities within the team. You will collaborate closely with the wider Enterprise Data team to deliver on business priorities. Working within the B & I Data Capabilities team, you will be responsible for managing the Data Quality/Metrics/Controls program and implementing improved data governance and management practices. This program focuses on enhancing Citi's approach to data risk and meeting regulatory commitments in this area.

Key Responsibilities:
- Hands-on data engineering work, with a strong understanding of distributed data platforms and cloud services.
- Knowledge of data architecture and integration with enterprise applications.
- Research and assess new data technologies and self-service data platforms.
- Collaborate with the Enterprise Architecture Team on refining the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Perform complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation.
- Build analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate findings and propose solutions to stakeholders.
- Translate business requirements into technical design documents.
- Collaborate with cross-functional teams for testing and implementation.
- Understand banking industry requirements.
- Perform other duties and functions as assigned.

Skills & Qualifications:
- 10+ years of development experience in Financial Services or Finance IT.
- Experience with Data Quality/Data Tracing/Data Lineage/Metadata Management tools.
- Hands-on ETL experience using PySpark, including data ingestion, Spark optimization, and batch orchestration.
- Proficiency in Hive, HDFS, Airflow, and job scheduling.
- Strong programming skills in Python with data manipulation and analysis libraries.
- Proficiency in writing complex SQL/stored procedures.
- Experience with DevOps tools such as Jenkins/Lightspeed, Git, and CoPilot.
- Knowledge of BI visualization tools such as Tableau and Power BI.
- Implementation experience with Datalake/Datawarehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree or master's degree in Information Systems, Business Analysis, or Computer Science.

In this role, you will be part of the Data Governance job family, focusing on Data Governance Foundation. This is a full-time position at Citi, where you will apply skills such as Data Management, Internal Controls, and Risk Management to drive compliance and achieve business objectives. If you require a reasonable accommodation due to a disability to use search tools or apply for a career opportunity at Citi, please review the Accessibility at Citi guidelines. You can also refer to Citi's EEO Policy Statement and the Know Your Rights poster for more information.
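Several of the Citi listings on this page call for hands-on PySpark covering data cleansing, transformation, joins, and aggregation. A minimal, illustrative sketch of that pattern follows; the file paths, column names, and business rules are hypothetical placeholders, not the bank's actual datasets.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade_quality_rollup").getOrCreate()

# Hypothetical source extracts landed on the distributed platform.
trades = spark.read.parquet("/data/raw/trades")        # assumed path
accounts = spark.read.parquet("/data/raw/accounts")    # assumed path

# Cleansing: drop records missing key fields and normalise a text column.
clean_trades = (
    trades
    .dropna(subset=["account_id", "notional"])
    .withColumn("asset_class", F.upper(F.trim(F.col("asset_class"))))
)

# Join and aggregate: total notional per asset class and account region.
summary = (
    clean_trades
    .join(accounts, on="account_id", how="inner")
    .groupBy("asset_class", "region")
    .agg(
        F.sum("notional").alias("total_notional"),
        F.count("*").alias("trade_count"),
    )
)

summary.write.mode("overwrite").parquet("/data/curated/trade_summary")  # assumed target
```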

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

You have an exciting opportunity to join us as a Database Specialist - SQL Server. With 4-6 years of experience, you will play a crucial role in developing high-quality database solutions, optimizing stored procedures and functions using T-SQL, and enhancing SQL queries for performance. Your expertise in SSIS and data warehousing will be vital for this role. As a Database Specialist, your responsibilities will include developing top-notch database solutions, implementing and optimizing T-SQL procedures and functions, and meeting ongoing business reporting requirements. You will research the necessary data, create the relevant reporting deliverables, analyze SQL queries for performance enhancements, and craft procedures and scripts for efficient data migration. Additionally, your proficiency in SSIS, data warehousing, SQL job scheduling, and issue resolution will be essential for ensuring smooth operations and providing timely management reporting. If you are passionate about database management and possess the required experience, we encourage you to apply for this role and be part of our dynamic team.
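As one possible illustration of the T-SQL work described, the sketch below calls a stored procedure and runs a parameterized query from Python via pyodbc; the server, database, procedure, and table names are hypothetical, and in practice much of this logic would live in SSIS packages or SQL Agent jobs rather than ad-hoc scripts.

```python
import pyodbc

# Hypothetical connection details for a SQL Server instance.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlsrv-dw01;DATABASE=SalesDW;"
    "UID=etl_user;PWD=***"
)
cursor = conn.cursor()

# Execute a (hypothetical) stored procedure that loads a reporting table.
cursor.execute("EXEC dbo.usp_LoadDailySalesSummary @LoadDate = ?", "2024-01-31")
conn.commit()

# Parameterized query against the freshly loaded summary table.
cursor.execute(
    "SELECT TOP 10 region, total_sales "
    "FROM dbo.DailySalesSummary WHERE load_date = ? ORDER BY total_sales DESC",
    "2024-01-31",
)
for region, total_sales in cursor.fetchall():
    print(region, total_sales)

conn.close()
```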

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional contributing to the development of new solutions, frameworks, and techniques for the Enterprise Data function. Your role will involve performing data analytics and analysis across different asset classes, as well as building data science and tooling capabilities within the team. You will work closely with the Enterprise Data team to deliver business priorities. The B & I Data Capabilities team manages the Data Quality/Metrics/Controls program and implements improved data governance and data management practices. The Data Quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments.

Key Responsibilities:
- Hands-on experience with data engineering and distributed data platforms.
- Understanding of data architecture and integration with enterprise applications.
- Research and evaluate new data technologies and self-service data platforms.
- Collaborate with the Enterprise Architecture Team on defining data strategy.
- Perform complex data analytics on large datasets.
- Build analytics dashboards and data science capabilities.
- Communicate findings and propose solutions to stakeholders.
- Convert business requirements into technical design documents.
- Work with cross-functional teams for implementation and support.
- Demonstrate a good understanding of the banking industry.
- Perform other assigned duties.

Skills & Qualifications:
- 10+ years of development experience in Financial Services or Finance IT.
- Experience with Data Quality/Data Tracing/Metadata Management tools.
- ETL experience using PySpark on distributed platforms.
- Proficiency in Python, SQL, and BI visualization tools.
- Strong knowledge of Hive, HDFS, Airflow, and job scheduling.
- Experience in Data Lake/Data Warehouse implementation.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree or master's degree in Information Systems, Business Analysis, or Computer Science.

If you are a person with a disability and require accommodation to use search tools or apply for a career opportunity, review Accessibility at Citi.

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

maharashtra

On-site

As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional applying your in-depth disciplinary knowledge to contribute to the development of new solutions, frameworks, and techniques for the Enterprise Data function. The role integrates subject matter expertise and industry knowledge within a defined area and requires a thorough understanding of how different areas collectively integrate within the sub-function to contribute to overall business objectives.

Your primary responsibilities will include performing data analytics and data analysis across various asset classes, as well as building data science and tooling capabilities within the team. You will collaborate closely with the wider Enterprise Data team, particularly the front-to-back leads, to deliver on business priorities. Working within the B & I Data Capabilities team in the Enterprise Data function, you will manage the Data Quality/Metrics/Controls program and implement improved data governance and data management practices across the region. The Data Quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments in this area.

Key Responsibilities:
- Use a data engineering background to work hands-on with distributed data platforms and cloud services.
- Demonstrate a sound understanding of data architecture and data integration with enterprise applications.
- Research and evaluate new data technologies, data mesh architecture, and self-service data platforms.
- Collaborate with the Enterprise Architecture Team to define and refine the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Perform complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation.
- Build analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate findings and propose solutions to various stakeholders.
- Translate business and functional requirements into technical design documents.
- Work closely with cross-functional teams to prepare handover documents and manage testing and implementation processes.
- Demonstrate an understanding of how the development function integrates within the overall business/technology landscape.

Skills & Qualifications:
- 10+ years of active development background in Financial Services or Finance IT.
- Experience with Data Quality, Data Tracing, Data Lineage, and Metadata Management tools.
- Hands-on ETL experience using PySpark on distributed platforms, including data ingestion, Spark optimization, resource utilization, and batch orchestration.
- Proficiency in programming languages such as Python, with experience in data manipulation and analysis libraries.
- Strong SQL skills and experience with DevOps tools such as Jenkins/Lightspeed, Git, and CoPilot.
- Knowledge of BI visualization tools such as Tableau and Power BI.
- Experience implementing Datalake/Datawarehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree or master's degree in Information Systems, Business Analysis, or Computer Science.

In this role, you will play a crucial part in driving compliance with applicable laws, rules, and regulations while safeguarding Citigroup, its clients, and assets. Your ability to assess risks and make informed business decisions will be essential in maintaining the firm's reputation. Please refer to the full job description for more details on the skills, qualifications, and responsibilities associated with this position.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

vijayawada, andhra pradesh

On-site

You should have at least 4+ years of experience in the analysis, design, and development of OBIEE 11g and 12c based reporting solutions. Your expertise should include hands-on experience in RPD modeling, OBIEE Answers, Dashboards, and iBots. You must be an expert in OBIEE administration and in configuring integrated security. You should also possess strong knowledge of PL/SQL and query languages. A good understanding of Datawarehouse and ETL concepts would be beneficial for this role.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Senior Azure Data Engineer based in Chennai, India, you will play a crucial role in designing and implementing data pipelines using Azure Synapse to integrate data from various sources and file formats into SQL Server databases. Your responsibilities will include developing batch and real-time data pipelines to transfer data to data warehouses and data lakes. Collaborating with the Data Architect, you will work on new data projects by building data pipelines and managing master data. Your expertise in data analysis, extraction, cleansing, column mapping, data transformations, and data modeling will be essential in meeting business requirements. You will also ensure data availability on Azure SQL Datawarehouse by monitoring and troubleshooting data pipelines effectively.

To excel in this role, you must have a minimum of 3 years of experience in designing and developing ETL pipelines using Azure Synapse or Azure Data Factory. Proficiency in Azure services such as ADLS Gen2, Databricks, Azure SQL, and Logic Apps is required. Strong implementation skills in PySpark and advanced SQL will be instrumental in building efficient data transformations. Experience in handling structured, semi-structured, and unstructured data formats is a must, along with a clear understanding of data warehouse and data lake modeling and ETL performance optimization. Additional skills that would be beneficial include working knowledge of consuming APIs in ETL pipelines, familiarity with Power BI, and experience in manufacturing data analytics and reporting. A degree in information technology, computer science, or a related discipline is preferred.

Join our global, inclusive, and diverse team dedicated to enhancing the quality of life through innovative motion systems. We value the diversity, knowledge, skills, creativity, and talents that each employee brings, and we are committed to fostering an inclusive, diverse, and equitable workplace where employees feel respected and valued, irrespective of their background. Our goal is to inspire our employees to grow, take ownership, and derive fulfillment and meaning from their work.
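A minimal sketch, assuming a Synapse or Databricks Spark environment with storage access already configured, of the kind of PySpark cleansing and column-mapping step such a pipeline might contain; the storage account, container, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_to_lake").getOrCreate()

# Hypothetical raw and curated zone paths on ADLS Gen2.
raw_path = "abfss://raw@contosodatalake.dfs.core.windows.net/erp/orders/"
curated_path = "abfss://curated@contosodatalake.dfs.core.windows.net/orders_clean/"

orders = spark.read.option("header", "true").csv(raw_path)

# Cleansing and column mapping: rename source columns, cast types, drop bad rows.
orders_clean = (
    orders
    .withColumnRenamed("ORD_NO", "order_id")
    .withColumn("order_date", F.to_date("ORD_DT", "yyyy-MM-dd"))
    .withColumn("amount", F.col("ORD_AMT").cast("decimal(18,2)"))
    .dropna(subset=["order_id", "order_date"])
    .select("order_id", "order_date", "amount")
)

# Land the curated output as Parquet, partitioned for downstream warehouse loads.
orders_clean.write.mode("overwrite").partitionBy("order_date").parquet(curated_path)
```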

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Cognos Migration Specialist at NTT DATA in Hyderabad, Telangana, India, you will be part of a dynamic team that values exceptional, innovative, and passionate individuals. We are committed to fostering an inclusive and forward-thinking work environment where you can thrive and grow professionally. To excel in this role, you should have at least 5 years of development experience in Visualization tools such as Cognos, Tableau, or Power BI, as well as a solid background in the Datawarehouse field. Your responsibilities will include leveraging your expertise in Business Intelligence and Data Analytics to drive insights and decision-making within the organization. A strong understanding of Data Warehouse concepts, including Slowly Changing Dimensions, Facts, and SCD1 and SCD2 implementations, is essential for success in this position. You will also be expected to utilize advanced SQL and programming languages to manipulate and analyze data effectively. Experience in migration projects involving visualization tools, particularly from Cognos to Power BI, will be considered a valuable asset. Your role will also involve debugging and back tracing issues, requiring strong analytical and problem-solving skills. Additionally, familiarity with Version control tools such as Azure and GIT, as well as Agile/Scrum methodologies, will be beneficial for collaborating effectively with the team and delivering high-quality solutions to our clients. Joining NTT DATA means becoming part of a trusted global innovator with a presence in over 50 countries. As a Global Top Employer, we are dedicated to helping clients innovate, optimize, and transform for long-term success. Our diverse team of experts collaborates with a robust partner ecosystem to deliver business and technology consulting, data and artificial intelligence solutions, industry-specific services, and application development and management. At NTT DATA, you will be at the forefront of digital and AI infrastructure, working towards a sustainable and confident digital future for organizations and society. If you are ready to contribute your skills and expertise to a company that values innovation and excellence, apply now and be a part of our exciting journey. Visit us at us.nttdata.com to learn more about our organization and the opportunities we offer.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As a Data Integrator at our company, you are expected to have 3 to 5 years of experience and to hold a B.Tech/BE degree. The role requires excellent communication skills and the ability to thrive in a complex matrix environment; we value self-motivation and a strong team-player mentality in our candidates. Your responsibilities will include demonstrating excellent development skills in PySpark, Python 2.x or 3.x, and SQL/PL SQL (Oracle, Teradata/Sybase). Experience working in a Unix/Linux environment is essential, along with exposure to ElasticSearch, GPDB, and Oracle. Knowledge of Datalake and Datawarehouse concepts is necessary, together with an understanding of physical, logical, and conceptual data models. You will be tasked with creating source-to-target mappings and system test cases and plans, as well as handling code versioning, change management, and production release support. As the single point of contact (SPOC) for Human Resources, you will play a crucial role in the company's operations. If you meet the requirements and are excited about this opportunity, please reach out to our Human Resources department at careers@tcgdigital.com.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You will be joining Tietoevry Create as a Snowflake Developer in Bengaluru, India. Your primary responsibility will be to design, implement, and maintain data solutions using Snowflake's cloud data platform. Working closely with cross-functional teams, you will deliver high-quality, scalable data solutions that drive business value. With over 7 years of experience, you should excel in designing and developing Datawarehouse and data integration projects at the SSE/TL level. Experience working in an Azure environment is essential, as is proficiency in developing ETL pipelines using Python and Snowflake SnowSQL. Expertise in writing SQL queries against Snowflake and an understanding of database design concepts (transactional, datamart, and data warehouse) are crucial. As a Snowflake data engineer, you will architect and implement large-scale data intelligence solutions around Snowflake Data Warehouse. Your role will involve loading data from diverse sources, translating complex requirements into detailed designs, and analyzing vast data stores to uncover insights. A strong background in architecting, designing, and operationalizing data and analytics solutions on Snowflake Cloud Data Warehouse is a must, along with clear articulation skills and a willingness to adapt and learn new skills. Tietoevry values diversity, equity, and inclusion and encourages applicants from all backgrounds to join the team. The company believes that diversity fosters innovation and creates an inspiring workplace; openness, trust, and diversity are core values driving Tietoevry's mission to create digital futures that benefit businesses, societies, and humanity.
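A minimal sketch, using the Snowflake Python connector, of the load-then-transform pattern such a pipeline might follow; the account, stage, and table names are hypothetical, and an equivalent flow could be scripted in SnowSQL instead.

```python
import snowflake.connector

# Hypothetical connection parameters; in practice these come from a secrets store.
conn = snowflake.connector.connect(
    account="xy12345.west-europe.azure",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="SALES_DW",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Bulk-load staged files into a staging table (stage and table are assumed to exist).
    cur.execute(
        "COPY INTO STG_ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Simple ELT step: merge the staged rows into the warehouse fact table.
    cur.execute("""
        MERGE INTO SALES_DW.CORE.FACT_ORDERS t
        USING SALES_DW.STAGING.STG_ORDERS s ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, ORDER_DATE, AMOUNT)
                              VALUES (s.ORDER_ID, s.ORDER_DATE, s.AMOUNT)
    """)
finally:
    conn.close()
```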

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Architect at Diageo, you will play a crucial role in contributing to the transformation of our business capabilities through data and digital technology. You will be responsible for analyzing the overall IT landscape and various technologies to ensure seamless integration. Your expertise in data modeling, schema design, and data architectures will be essential in driving our enterprise data management, data warehouse, and business intelligence initiatives. You will have the opportunity to review data models for completeness, quality, and adherence to established architecture standards. Your strong capabilities in comparing and recommending tools and technologies will be instrumental in enhancing our data management processes. Additionally, your proficiency in metadata maintenance and data catalog management will contribute to the overall efficiency of our data systems. Preferred qualifications for this role include experience with Databricks Lakehouse architecture, expertise in working with file formats such as Parquet, ORC, AVRO, Delta, and Hudi, and exposure to CI/CD tools like Azure DevOps. Knowledge and experience with Azure data offerings will be beneficial in effectively leveraging our data resources. If you are passionate about leveraging data and technology to drive business growth and innovation, and if you thrive in a dynamic and collaborative environment, we invite you to join our team at Diageo. Your contributions will play a key role in shaping the future of our digital and technology initiatives.,

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

haryana

On-site

As a Technical Consultant specializing in Informatica and Oracle, you will be responsible for understanding complex technical and architectural issues, as well as the implications of the chosen technical strategy. Your role will involve interacting with various levels of the technical and business community and seeking approval from all stakeholders involved in the projects.

To qualify for this position, you should hold a B.E./B.Tech. or M.C.A. in Computer Science from a reputed university, with a minimum of 10 years of relevant industry experience. Your expertise should include technical leadership in Informatica, Oracle, Unix scripting, Perl, and scheduling tools such as Autosys/Control-M. You should also possess sound knowledge of database design, data warehouse, data mart, enterprise reporting, and ODS concepts.

Your responsibilities will include, but are not limited to:
- Demonstrating strong Oracle PL/SQL and T-SQL experience.
- Managing complete project lifecycle execution from requirements analysis to go-live, with exposure to Agile methodology being a plus.
- Producing design/technical specifications and proposing solutions for new projects.
- Collaborating with delivery managers, system/business analysts, and other subject matter experts to design solutions.
- Working closely with business and technology teams to ensure proposed solutions meet requirements.
- Developing and implementing standards, procedures, and best practices for data management and optimization.
- Guiding and mentoring junior team members in solution building and troubleshooting.
- Using your strong communication skills to liaise effectively with stakeholders.
- Applying knowledge of fund accounting, fund reporting, and derivative concepts.

In addition, you should have exposure to building reporting solutions with WebFOCUS/OBIEE. Overall, the Technical Consultant role requires a deep understanding of data modeling, data normalization, and performance optimization techniques. If you have a strong background in solution architecture, Informatica, Oracle, DWH, PL/SQL, technical architecture, Unix scripting, Perl, Autosys, Control-M, data modeling, data normalization, performance optimization, OBIEE, WebFOCUS, and fund accounting, we encourage you to apply for this exciting opportunity in the IT/Computers-Software industry.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be working at a leading, global security authority that is known for disrupting its own category. The encryption provided by the company is trusted by major ecommerce brands, world's largest companies, major cloud providers, entire country financial systems, internets of things, and even surgically embedded pacemakers. The primary goal is to help companies establish digital trust in the real world. As a Senior Software Engineer, your role will involve collaborating with product managers, UX designers, and architects to understand project requirements and technical specifications. You will be responsible for designing, developing, testing, and maintaining software applications to ensure they meet high-quality standards and performance benchmarks. Writing clean, efficient, and maintainable code in alignment with coding standards and best practices will be a key aspect of your responsibilities. Additionally, you will conduct code reviews to enhance code quality, consistency, and alignment with the product design and architecture. Your tasks will also include analyzing, troubleshooting, and debugging product defects, providing timely solutions to customer issues, and staying updated on emerging technologies and industry trends to improve software development processes and tools continuously. You will contribute to architectural decisions and drive technical innovation within the team. To be eligible for this role, you should possess a Bachelor's degree in computer science, Software Engineering, or a related field, or equivalent experience. A minimum of 5 years of professional experience in a software development role is required. Proficiency in Java, Python, and a solid understanding of software development principles is essential. Experience in developing LLM-powered applications using AI frameworks like LangChain or LlamaIndex, as well as expertise in GenAI product development focusing on RAG techniques and AI agents, is preferred. Furthermore, familiarity with TensorFlow, PyTorch, LangChain, OpenAI APIs, NLP libraries, AI application development, data analytics, cloud computing platforms, MLOps, CI/CD pipelines, RESTful web services, CI/CD tools, databases, containerization with Docker, container orchestration using Kubernetes and Helm, software development methodologies like Agile or Scrum, secure coding practices, and unit testing is beneficial. Good communication and collaboration skills along with the ability to work effectively in cross-functional teams are essential. Having knowledge of PKI, Cryptography, and Code Signing would be an added advantage for this role. The company offers generous time-off policies, top-notch benefits, as well as education, wellness, and lifestyle support as part of the benefits package.,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You are seeking a Cognos Migration Specialist position at NTT DATA in Hyderabad, Telangana (IN-TG), India. As a Cognos Migration Specialist, you will utilize your 5+ years of development experience in Visualization tools such as Cognos, Tableau, or Power BI, and Datawarehouse field. You should possess good knowledge of Business Intelligence and Data Analytics and have a sound understanding of Data Warehouse concepts like Slowly changing dimensions, Facts, SCD 1, SCD2 implementations. Your role will involve utilizing advanced SQL and other programming languages to manipulate and analyze data. Experience in migration of visualization tools, especially from Cognos to Power BI, will be considered a plus. In this role, you will be expected to have good experience in debugging and back tracing issues, along with strong analytical and problem-solving skills. Knowledge of Version control tools like Azure and GIT, as well as Agile/Scrum methodologies, is desired. By joining NTT DATA, a $30 billion trusted global innovator of business and technology services, you will have the opportunity to work with diverse experts in more than 50 countries and a robust partner ecosystem. NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. As a part of NTT Group, which invests over $3.6 billion each year in R&D, NTT DATA aims to help organizations and society move confidently and sustainably into the digital future. If you are an exceptional, innovative, and passionate individual with a desire to grow, apply now to be part of our inclusive, adaptable, and forward-thinking organization. For more information, visit us at us.nttdata.com.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

At EY, you will have the opportunity to develop a career that is as unique as you are, supported by a global network, inclusive culture, and cutting-edge technology to help you reach your full potential. Your distinctive voice and perspective are essential to EY's continuous improvement. Join our team to create an exceptional experience for yourself and contribute to building a better working world for all.

We are currently seeking a Data Modeller - Senior with a solid understanding of technology and data in the data modelling space, coupled with a proven track record in project delivery. This role offers an exciting opportunity to be part of a leading firm and a growing Data and Analytics team.

Your responsibilities will include:
- Utilizing tools and techniques for analyzing data collection, updating, storage, and exchange.
- Defining and implementing data modelling and design standards, tools, best practices, and development methodologies.
- Designing, reviewing, and maintaining data models.
- Performing data analysis to capture requirements and visualize them in data models.
- Managing the data model lifecycle from requirements to design, implementation, and maintenance.
- Collaborating with data engineers to create optimal physical data models.
- Identifying opportunities to leverage data for enhancing business activities.

Key Skills and Attributes:
- 3-7 years of relevant experience.
- Proficiency in data modelling.
- Experience with data modelling tools such as Erwin Data Modeler, ER/Studio, or Toad.
- Strong knowledge of SQL.
- Basic ETL skills for ensuring proper implementation of ETL processes.
- Good understanding of Datawarehouse concepts.
- Optional skills in data visualization.
- Knowledge of Data Quality (DQ) and data profiling techniques and tools.

To be considered for this role, you should:
- Hold a degree in computer science or equivalent, with 3-7 years of industry experience.
- Preferably have experience in Agile delivery methodology.
- Possess a proactive and flexible working style with a strong sense of ownership.
- Demonstrate strong analytical and problem-solving skills.
- Exhibit proficiency in software development best practices.
- Have experience implementing enterprise-grade solutions and translating business challenges into technical solutions.
- Be an excellent communicator, both in writing and verbally.
- Participate in all stages of the solution delivery life cycle, including analysis, design, development, testing, deployment, and support.
- Have strong client management skills.

Join EY in building a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Our diverse teams across 150 countries leverage data and technology to provide assurance and help clients grow, transform, and thrive in today's complex landscape of assurance, consulting, law, strategy, tax, and transactions.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 - 0 Lacs

hyderabad

Work from Office

Role & responsibilities: ETL Lead Developer (8+ years, with a minimum of 5 years of lead experience).

Mandatory skills:
- ETL and Datawarehouse concepts
- AWS, Glue
- SQL
- Python
- Snowflake
- CI/CD tools (Jenkins, GitHub), Azure, DataStage

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

pune

Hybrid

Job Title: Senior Data Engineer / Module Lead
Location: Pune, Maharashtra, India
Experience Level: 5-8 Years

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing team in Pune. The ideal candidate will have a strong background in data engineering, with a particular focus on Google Cloud Platform (GCP), Apache Airflow, and end-to-end ETL pipeline development. You will be responsible for designing, developing, and maintaining robust and scalable data pipelines, ensuring data quality, and optimizing data solutions for performance and cost. This role requires a hands-on approach and the ability to work both independently and collaboratively in an agile environment.

Responsibilities:
- Design, develop, and deploy scalable ETL pipelines using GCP data services, including BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
- Develop, deploy, and manage complex DAGs in Apache Airflow for orchestrating data workflows.
- Write and optimize complex SQL and PL/SQL queries, stored procedures, and functions for data manipulation, transformation, and analysis.
- Optimize BigQuery workloads for performance, cost efficiency, and scalability.
- Develop scripts using Python and shell scripting to support automation, data movement, and transformations.
- Ensure data quality, integrity, and reliability across all data solutions.
- Collaborate with cross-functional teams, including data scientists, analysts, and engineers, to understand data requirements and deliver effective solutions.
- Participate in code reviews and contribute to establishing and maintaining data engineering best practices.
- Troubleshoot and resolve data pipeline issues in a timely manner.
- Use version control systems (e.g., Git) for managing code and collaborating on engineering work.
- Stay updated on the latest trends and technologies in data engineering, cloud computing, and ETL processes.

Required Skills and Qualifications:
- 5-8 years of hands-on experience in data engineering roles.
- Mandatory skills: ETL pipeline design and implementation; SQL and PL/SQL (complex queries, procedures, and transformations); Google Cloud Platform (GCP), including BigQuery, Cloud Composer, and Cloud Storage; Apache Airflow (designing and deploying complex DAGs); Python for scripting and data processing; shell scripting for automation and orchestration tasks.
- Experience with Informatica is a strong plus.
- Proven ability to optimize BigQuery for performance and cost.
- Familiarity with Git and version control best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to thrive both independently and within a collaborative agile team.
- Bachelor's degree in Computer Science, Engineering, or a related field.

What We Offer:
- A challenging and rewarding role in a dynamic and fast-paced environment.
- Opportunity to work with cutting-edge technologies on the Google Cloud Platform.
- A collaborative and supportive team culture.
- Continuous learning and professional growth opportunities.
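A minimal sketch, assuming Airflow 2.x with the Google provider package installed (as on Cloud Composer), of a daily DAG that runs a BigQuery transformation of the kind this role describes; the DAG id, project, dataset, and SQL are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily rollup query; source and destination tables are placeholders.
ROLLUP_SQL = """
CREATE OR REPLACE TABLE `my-project.sales_dw.daily_sales` AS
SELECT order_date, SUM(amount) AS total_amount
FROM `my-project.sales_dw.fact_orders`
GROUP BY order_date
"""

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    # Submit the query as a BigQuery job via the Google provider operator.
    rollup = BigQueryInsertJobOperator(
        task_id="run_daily_rollup",
        configuration={"query": {"query": ROLLUP_SQL, "useLegacySql": False}},
        location="US",  # assumed dataset location
    )
```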

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We are counting on your unique voice and perspective to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all. EY's GDS Tax Technology team's mission is to develop, implement, and integrate technology solutions that better serve our clients and engagement teams. As a member of EY's core Tax practice, you'll develop a deep tax technical knowledge along with outstanding database, data analytics, and programming skills. The ever-increasing regulations require tax departments to gather, organize, and analyze more data than ever before. EY's GDS Tax Technology team members work side-by-side with partners, clients, and tax technical subject matter experts to develop and incorporate technology solutions that enhance value, improve efficiencies, and enable clients with disruptive and market-leading tools supporting tax. We are currently seeking a Data Engineer - Staff to join our Tax Technology practice in India. The key responsibilities include strong database programming/backend development experience using SQL Server/Azure SQL, writing complex queries, stored procedures, views, triggers, cursors, and UDFs. You should have strong verbal and written communication skills and between 1.5 and 3 years of experience in SQL/Azure SQL Database Server programming and Azure Data Factory. Experience with the Azure Data Platform (Azure Data factory/ Databricks) is an added advantage. The ability to effectively communicate with other team members and stakeholders is essential. The qualification required includes strong verbal and written communication skills, the ability to work as an individual contributor, and experience on Azure Data Factory V2, Azure Synapse, or Azure Databricks. EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations and offer a wide variety of fulfilling career opportunities. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. Continuous learning, success as defined by you, transformative leadership, and a diverse and inclusive culture are some of the benefits you can expect at EY. Join us in building a better working world.,

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Ops Capability Deployment Analyst at our organization, you will be a seasoned professional contributing to the development of new solutions, frameworks, and techniques while improving processes and workflow for the Enterprise Data function. The role integrates subject matter and industry expertise within a defined area and requires an in-depth understanding of how different areas collectively integrate within the sub-function to contribute to overall business objectives.

Your primary responsibility will be to perform data analytics and analysis across various asset classes and to establish data science and tooling capabilities within the team. You will collaborate closely with the wider Enterprise Data team, particularly the front-to-back leads, to deliver on business priorities effectively. Joining the B & I Data Capabilities team within Enterprise Data, you will be involved in managing the Data Quality/Metrics/Controls program and implementing improved data governance and data management practices throughout the region. The Data Quality program focuses on enhancing our approach to data risk and meeting regulatory commitments in this area.

Key Responsibilities:
- Apply a data engineering background and expertise in distributed data platforms and cloud services.
- Demonstrate a sound understanding of data architecture and integration with enterprise applications.
- Research and assess new data technologies, data mesh architecture, and self-service data platforms.
- Collaborate with the Enterprise Architecture Team to define and refine the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Conduct complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation.
- Develop analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate findings and propose solutions to stakeholders effectively.
- Translate business and functional requirements into technical design documents.
- Collaborate with cross-functional teams such as Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Control, and Production Support.
- Prepare handover documents and manage SIT, UAT, and implementation processes.
- Demonstrate a deep understanding of how the development function integrates within the overall business/technology landscape to achieve objectives.
- Perform other assigned duties as necessary.

Skills & Qualifications:
- 10+ years of active development background in Financial Services or Finance IT.
- Experience with Data Quality/Data Tracing/Data Lineage/Metadata Management tools.
- Hands-on ETL experience using PySpark on distributed platforms, including data ingestion, Spark optimization, and batch orchestration.
- Proficiency in Hive, HDFS, Airflow, and job schedulers.
- Strong programming skills in Python, with experience in data manipulation and analysis libraries (Pandas, NumPy).
- Ability to write complex SQL/stored procedures.
- Experience with DevOps, Jenkins/Lightspeed, Git, and CoPilot.
- Proficiency in one or more BI visualization tools such as Tableau or Power BI.
- Proven experience implementing Datalake/Datawarehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree or master's degree in Information Systems, Business Analysis, or Computer Science.

If you are looking for a challenging opportunity to apply your expertise in data analytics, data engineering, and data science, this role offers a dynamic environment where you can contribute to the growth and success of the Enterprise Data function within our organization.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be responsible for understanding the business unit requirements and developing ETL pipelines accordingly using Informatica. Your role will involve gathering requirements from stakeholders, seeking clarifications down to the smallest detail, and planning and executing ETL scripts. You will need to highlight any risks or concerns related to assigned tasks and escalate them to your immediate supervisor when necessary. Additionally, performing unit testing on the ETL processes to ensure the quality of work output is also part of your responsibilities. Supporting project delivery teams in adopting and executing data management processes will be another key aspect of your role. You will be expected to identify and address data quality issues such as uniqueness, integrity, accuracy, consistency, and completeness in a cost-effective and timely manner. Additionally, you will rotate in the production support role for any escalations that may arise. To qualify for this position, you should hold a BE/B Tech/MSc/MCA degree with a specialization in Computer Science/Information Systems. You are required to have approximately 6 years of experience in Informatica Data Integration Tool, 6 years of experience in writing SQL queries for Oracle databases, and 3 years of experience in Python scripting. Exposure to scheduling tools such as Control-M or Autosys is preferred, as well as experience with Data Quality Processes or Informatica tool components like IDQ Informatica Data Quality. Strong communication skills, a proven track record of working collaboratively in teams, and the ability to prioritize and manage a complex workload are essential for this role. You should also have experience in interpersonal and relationship management, possess strong organizational skills, and demonstrate the capacity to gain a thorough understanding of the relevant business area. Being able to work effectively as part of a team and following either the SDLC life cycle or Agile Development life cycle as required are also important. This position does not involve any travel requirements and follows a mid-shift schedule from 2 PM to 11 PM.,

Posted 1 month ago

Apply
Page 1 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

