7.0 - 11.0 years
0 Lacs
Karnataka
On-site
Your Responsibilities
Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL). Collaborate with solution teams and Data Architects to implement data strategies, build data flows, and develop logical/physical data models. Work with Data Architects to define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs. Proactively and independently address project requirements and articulate issues/challenges to reduce project delivery risks.

Your Profile
Bachelor's degree in computer/data science, or related technical experience. 7+ years of hands-on relational, dimensional, and/or analytic experience utilizing RDBMS, dimensional, and NoSQL data platform technologies, along with ETL and data ingestion protocols. Demonstrated experience with data warehouses, data lakes, and enterprise big data platforms in multi-data-center contexts. Proficient in metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio). Experience with Azure services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse, and Azure Databricks) is preferred, and experience with SAP Datasphere is a plus. Experience in team management, communication, and presentation. Understanding of agile delivery methodology and experience working in a scrum environment. Ability to translate business needs into data vault and dimensional data models supporting long-term solutions. Collaborate with the Application Development team to implement data strategies, and create logical and physical data models using best practices to ensure high data quality and reduced redundancy. Optimize and update logical and physical data models to support new and existing projects. Maintain logical and physical data models along with corresponding metadata.
Develop best practices for standard naming conventions and coding practices to ensure data model consistency. Recommend opportunities for data model reuse in new environments. Perform reverse engineering of physical data models from databases and SQL scripts. Evaluate data models and physical databases for variances and discrepancies. Validate business data objects for accuracy and completeness. Analyze data-related system integration challenges and propose appropriate solutions. Develop data models according to company standards. Guide System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces. Review modifications to existing data models to improve efficiency and performance. Examine new application design and recommend corrections as needed.

#IncludingYou
Diversity, equity, inclusion, and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. ADM is committed to attracting and retaining a diverse workforce and creating welcoming, inclusive work environments that enable every ADM colleague to feel comfortable, make meaningful contributions, and grow their career. ADM values the unique backgrounds and experiences that each person brings to the organization, understanding that diversity of perspectives makes us stronger together. For more information regarding ADM's efforts to advance Diversity, Equity, Inclusion & Belonging, please visit the website: Diversity, Equity and Inclusion | ADM.

About ADM
At ADM, the power of nature is unlocked to provide access to nutrition worldwide. With industry-advancing innovations, a comprehensive portfolio of ingredients and solutions catering to diverse tastes, and a commitment to sustainability, ADM offers customers an edge in addressing nutritional challenges.
As a global leader in human and animal nutrition and the premier agricultural origination and processing company worldwide, ADM's capabilities in insights, facilities, and logistical expertise are unparalleled. From ideation to solution, ADM enriches the quality of life globally. Learn more at www.adm.com.
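The posting above asks for the ability to translate business needs into dimensional data models. As a hedged illustration of what that means in practice, here is a minimal star-schema sketch using an in-memory SQLite database; the table and column names (dim_date, dim_product, fact_sales) are illustrative assumptions, not taken from the posting.

```python
import sqlite3

# Minimal star schema: one fact table keyed into two dimension tables.
# All names and sample values here are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical dimensional query: aggregate the fact, slice by dimension attributes.
cur.execute("""
SELECT d.year, p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category
""")
row = cur.fetchone()
print(row)
```

The design choice illustrated is the one the posting alludes to with "reduced redundancy": descriptive attributes live once in the dimensions, while the fact table stays narrow and additive.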
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role. You will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments.
Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
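The role above calls for maintaining metadata artifacts such as data dictionaries. A minimal sketch of generating one from a modeled schema is shown below; the schema, table, and column names are hypothetical examples, not part of the posting.

```python
# Sketch: flatten a modeled schema into data-dictionary rows.
# The schema contents below are illustrative assumptions.
schema = {
    "fact_orders": [
        ("order_key", "INTEGER", "Surrogate key for the order"),
        ("order_ts", "TIMESTAMP", "Order creation time (UTC)"),
        ("amount", "DECIMAL(18,2)", "Order total in account currency"),
    ],
}

def data_dictionary(schema):
    """Turn {table: [(column, type, description)]} into one row per column."""
    rows = []
    for table, columns in sorted(schema.items()):
        for column, dtype, desc in columns:
            rows.append({"table": table, "column": column,
                         "type": dtype, "description": desc})
    return rows

dictionary = data_dictionary(schema)
for entry in dictionary:
    print(f"{entry['table']}.{entry['column']} ({entry['type']}): {entry['description']}")
```

In practice such rows would be exported to a catalog or wiki rather than printed, but the shape of the artifact (one described row per column) is the same.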
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the Role
In this opportunity as Data Engineer, you will:
- Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer
- Innovate with new approaches to meeting data management requirements
- Effectively communicate and liaise with other data management teams embedded across the organization and with data consumers in data science and business analytics teams
- Analyze existing data pipelines and assist in enhancing and re-engineering the pipelines as per business requirements
- Bachelor's degree or equivalent required; Computer Science or related technical degree preferred

About You
You're a fit for the role if your background includes:
Mandatory skills: Data Warehousing, data models, data processing [good to have], SQL, Power BI / Tableau, Snowflake [good to have], Python
- 3.5+ years of relevant experience in implementation of data warehouses and data management of data technologies for large-scale organizations
- Experience in building and maintaining optimized and highly available data pipelines that facilitate deeper analysis and reporting
- Experience analyzing data pipelines
- Knowledgeable about Data Warehousing, including data models and data processing
- Broad understanding of the technologies used to build and operate data and analytic systems
- Excellent critical thinking, communication, presentation, documentation, troubleshooting, and collaborative problem-solving skills
- Beginner to intermediate knowledge of AWS, Snowflake, and Python
- Hands-on experience with programming and scripting languages
- Knowledge of and hands-on experience
with Data Vault 2.0 is a plus
You should also have experience in and comfort with some of the following skills/concepts:
- Strong SQL writing and performance tuning
- Data integration tools like dbt, Informatica, etc.
- Intermediate skills in a programming language like Python/PySpark/Java/JavaScript
- AWS services and management, including serverless, container, queueing, and monitoring services
- Consuming and building APIs
#LI-SM1

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Delhi
On-site
You are a skilled and experienced Business Analyst with a strong knowledge and expertise in Property and Casualty (P&C) insurance. You possess a comprehensive understanding of P&C insurance principles, processes, and systems.

Your main responsibilities will include collaborating with stakeholders to gather and analyze business requirements relevant to P&C insurance operations. You will conduct insightful research into P&C insurance products, markets, and competitors to identify enhancements and opportunities. Leveraging your extensive P&C insurance knowledge, you will design and develop impactful solutions for business challenges. You will partner with cross-functional teams to ensure the seamless implementation of developed business solutions. Additionally, you will provide in-depth expertise for the development and upgrading of P&C insurance systems and applications. Your role will also involve performing meticulous data analysis and validation to maintain system data accuracy, as well as creating and maintaining comprehensive documentation such as functional specifications, business process flows, and user manuals.

To excel in this role, you are expected to have a deep understanding of P&C insurance principles, products, and methodologies. You should demonstrate proven expertise in PL/SQL with the ability to craft intricate queries for database data manipulation and analysis. Familiarity with dimensional data marts and their applications in data warehousing settings is essential. Proficiency in business intelligence tools for developing reports and dashboards specific to P&C insurance analytics is required. Your exceptional analytical skills will enable you to translate complex requirements into tangible functional specifications. Strong communication proficiency is necessary for efficient collaboration with both technical and non-technical parties. Your rigorous attention to detail emphasizes data integrity and precision.
You should have the capacity to independently handle tasks within dynamic team settings while managing multiple priorities and deadlines. A bachelor's degree in Business Administration, Computer Science, or related disciplines is preferred, and notable experience within the P&C insurance industry will be advantageous. Your skills should include data validation, business intelligence tools, insurance knowledge, proficiency in the policy and claim lifecycle, data analysis, analytical skills, data modeling, PL/SQL expertise, BRD creation, communication skills, source-to-target mapping, data profiling, SQL knowledge, attention to detail, documentation abilities, analytics, Property and Casualty (P&C) insurance understanding, data vault knowledge, and reporting and dashboard development proficiency.
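Among the skills listed above is data profiling, i.e. summarizing a column's completeness and value distribution before mapping it to a target model. A minimal sketch in plain Python follows; the policy-status column and its values are hypothetical, not drawn from any real P&C system.

```python
from collections import Counter

def profile_column(values):
    """Basic profile of one column: row count, nulls, distincts, modal value."""
    non_null = [v for v in values if v is not None]
    counts = Counter(non_null)
    top = counts.most_common(1)[0] if counts else (None, 0)
    return {
        "count": len(values),        # total rows, including nulls
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),     # distinct non-null values
        "top_value": top[0],         # most frequent non-null value
    }

# Hypothetical policy-status column from a P&C policy extract.
statuses = ["ACTIVE", "ACTIVE", "LAPSED", None, "ACTIVE", "CANCELLED"]
profile = profile_column(statuses)
print(profile)
```

A profile like this is what typically feeds a source-to-target mapping document: null counts flag columns needing default rules, and the distinct-value list drives code-to-description lookups.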
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Jaipur, Rajasthan
On-site
You are a Sr. Data Engineer with a strong background in building ELT pipelines and expertise in modern data engineering practices. You are experienced with Databricks and DBT, proficient in SQL and Python, and have a solid understanding of data warehousing methodologies such as Kimball or Data Vault. You are comfortable working with DevOps tools, particularly within AWS, Databricks, and GitLab. Your role involves collaborating with cross-functional teams to design, develop, and maintain scalable data infrastructure and pipelines using Databricks and DBT. Your responsibilities include designing, building, and maintaining scalable ELT pipelines for processing and transforming large datasets efficiently in Databricks. You will implement Kimball data warehousing methodologies or other multi-dimensional modeling approaches using DBT. Leveraging AWS, Databricks, and GitLab, you will implement CI/CD practices for data engineering workflows. Additionally, you will optimize SQL queries and database performance, monitor and fine-tune data pipelines and queries, and ensure compliance with data security, privacy, and governance standards. Key qualifications for this role include 6+ years of data engineering experience, hands-on experience with Databricks and DBT, proficiency in SQL and Python, experience with Kimball data warehousing or Data Vault methodologies, familiarity with DevOps tools and practices, strong problem-solving skills, and the ability to work in a fast-paced, agile environment. Preferred qualifications include experience with Apache Spark for large-scale data processing, familiarity with CI/CD pipelines for data engineering workflows, understanding of orchestration tools like Apache Airflow, and certifications in AWS, Databricks, or DBT. 
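The ELT pattern described above (land raw data first, then transform it inside the warehouse, as Databricks and DBT encourage) can be sketched in a few lines. This is a hedched, toy illustration: the raw feed, field names, and in-memory "tables" are assumptions standing in for real pipeline infrastructure.

```python
# Minimal ELT sketch: load raw rows untouched, then transform them in place,
# the way a dbt model would transform a landed table with SQL.
# The feed contents and field names are illustrative assumptions.

raw_feed = [
    {"id": "1", "amount": "10.50", "country": "in"},
    {"id": "2", "amount": "7.25", "country": "us"},
]

def load(feed):
    """E+L step: land rows exactly as received, no casting or cleansing."""
    return list(feed)

def transform(raw_rows):
    """T step: cast types and standardize values on top of the raw layer."""
    return [
        {"id": int(r["id"]),
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in raw_rows
    ]

staged = transform(load(raw_feed))
print(staged)
```

The design point is the ordering: because the raw layer is kept verbatim, a fixed transformation can be re-run over history without re-extracting from the source, which is the main operational argument for ELT over ETL.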
In return, you will receive benefits such as medical insurance for employees, spouses, and children, accidental life insurance, provident fund, paid vacation time, paid holidays, employee referral bonuses, reimbursement for high-speed internet at home, a one-month free stay for employees moving from other cities, tax-free benefits, and other bonuses as determined by management.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient and scalable data models that support analytics and reporting needs. You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards. Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Additionally, supporting the migration of data models, ensuring alignment with overall data architecture, and implementing data modeling best practices are essential aspects of your role. Your qualifications include extensive experience in data modeling for analytics and reporting systems, strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency in Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required. You should have experience in migrating data models from legacy systems, strong SQL skills, and understanding of data governance principles. 
Technical skills that you must possess include expertise in data modeling methodologies, Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, big data technologies, and healthcare industry knowledge. Your knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs. Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree being preferred. Professional certifications in data modeling or related areas would be advantageous for this role.
Posted 3 days ago
3.0 - 7.0 years
12 - 15 Lacs
Pune
Work from Office
About The Role
Job Title: Database Engineer - CF, AS
Location: Pune, India
Corporate Title: AS

Role Description
DWS Technology is a global team of technology specialists, spread across multiple trading hubs and tech centres. We have a strong focus on promoting technical excellence; our engineers work at the forefront of financial services innovation using cutting-edge technologies. Our India location is our most recent addition to our global network of tech centres and is growing strongly. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create #GlobalHausbank solutions from our home market.

DWS Corporate Function Technology
The DWS Corporate Function Technology team covers technology for corporate functions like finance, risk, ALM, AFC, etc. This position is for the SIMS application specifically, which is a financial data warehouse that provides KPIs for management, quarterly, and annual reporting, among other things. The application consists of an Oracle database and a Java web front-end. As a Database Engineer, you will be responsible for maintaining, enhancing, and optimizing the application in collaboration with the engineering team and business.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
You will support the application team by maintaining, enhancing, and optimizing an Oracle-based financial data warehouse application (SIMS) as well as a web application with an MSSQL database backend (FBRAE), with a focus on delivering robust solutions to meet business needs.
Your responsibilities include:
- Collaborating with business stakeholders to design and implement new features, primarily through database development (PL/SQL, T-SQL)
- Ensuring application stability by analyzing and resolving data-related inquiries from the business, performing performance tuning, and optimizing processes
- Maintaining and enhancing reporting data marts built on a Data Vault architecture
- Supporting the team in migrating the applications' front-ends to a modern ReactJS/Spring Boot technology stack, leveraging a microservices-oriented architecture hosted on the Google Cloud Platform

Your skills and experience
- Master's degree (or equivalent) in Computer Science, Business Information Technology, or a related field
- Demonstrated expertise in Oracle PL/SQL or MS T-SQL development, with significant professional experience working on relational databases; this is a critical requirement for the role
- Strong analytical and problem-solving skills
- Familiarity with an ETL tool (e.g., Informatica) and/or a reporting tool (e.g., Cognos) is desirable
- Experience in one or more of the following areas is advantageous: batch programming, Java/JavaScript programming (including ReactJS), or microservices architecture
- Fluency in written and spoken English

How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
Posted 3 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation of the data models. Your role involves creating and maintaining data models, ensuring performance and quality of deliverables.

Experience:
- Overall IT experience (No. of years): 7+
- Data Modeling Experience: 3+

Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements
- Drive discovery activities and design workshops with the client and support design discussions
- Create data modeling deliverables and get sign-off
- Develop the solution blueprint and scoping, and estimate delivery for the project

Technical Experience:
Must Have Skills:
- 7+ years' overall IT experience with 3+ years in Data Modeling
- Data modeling experience in Dimensional Modeling/3NF modeling/NoSQL DB modeling
- Experience on at least one cloud DB design engagement
- Conversant with modern data platforms
- Work experience on data transformation and analytics projects; understanding of DWH
- Instrumental in DB design through all stages of data modeling
- Experience in at least one leading data modeling tool, e.g.
Erwin, ER/Studio, or equivalent

Good to Have Skills:
- Any of these add-on skills: Data Vault Modeling, Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Preferred understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge
- Cloud Data Engineering, Cloud Data Integration
- Must be familiar with Data Architecture Principles

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification:
- B.E. or B.Tech. is a must; 15 years full time education
Posted 3 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Experience:
- Overall IT experience (No. of years): 7+
- Data Modeling Experience: 3+
- Data Vault Modeling Experience: 2+

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning
- Develop the solution blueprint and scoping, and estimation in delivery projects and solutioning
- Drive discovery activities and design workshops with the client, and lead strategic road mapping and operating model design discussions
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites
- Design and develop the Raw Data Vault and Business Data Vault
- Translate business requirements into conceptual, logical, and physical data models
- Work with source system analysts to understand data structures and lineage
- Ensure conformance to data modeling standards and best practices
- Collaborate with ETL/ELT developers to implement data models
in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery)
- Document models, data definitions, and metadata

Technical Experience:
Must Have Skills:
- 7+ years' overall IT experience, with 3+ years in Data Modeling and 2+ years in Data Vault Modeling
- Design and development of the Raw Data Vault and Business Data Vault
- Strong understanding of the Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking
- Data modeling experience in Dimensional Modeling/3NF modeling
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar)
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery)
- Excellent SQL skills

Good to Have Skills:
- Any of these add-on skills: Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar)
- Preferred understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge
- Cloud Data Engineering, Cloud Data Integration

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification:
- B.E. or B.Tech. is a must; 15 years full time education
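The Data Vault 2.0 structures named above (Hubs keyed by hashed business keys, with Satellites carrying descriptive attributes and a hash diff for change detection) can be sketched briefly. This is a simplified illustration under common Data Vault conventions; the entity names, the MD5 choice, and the sample customer record are assumptions, not a definitive implementation.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*parts):
    """Deterministic hash key from business-key parts (Data Vault convention:
    trim, uppercase, join with a delimiter, then hash)."""
    joined = "||".join(str(p).strip().upper() for p in parts)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def hub_row(business_key, source):
    """Hub: one row per unique business key, plus load metadata."""
    return {
        "hub_hash_key": hash_key(business_key),
        "business_key": business_key,
        "record_source": source,
        "load_ts": datetime.now(timezone.utc).isoformat(),
    }

def satellite_row(business_key, attributes, source):
    """Satellite: descriptive attributes hang off the hub via the same hash key;
    hash_diff lets the loader detect attribute changes cheaply."""
    return {
        "hub_hash_key": hash_key(business_key),
        "hash_diff": hash_key(*sorted(attributes.values())),
        **attributes,
        "record_source": source,
    }

# Hypothetical customer record arriving from a CRM source system.
hub = hub_row("CUST-001", "crm")
sat = satellite_row("CUST-001", {"name": "Acme", "tier": "gold"}, "crm")
print(hub["hub_hash_key"] == sat["hub_hash_key"])
```

The point of the shared hash key is that Hubs, Links, and Satellites can be loaded independently and in parallel, since each table derives its join key from the business key alone rather than from a sequence owned by another table.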
Posted 3 days ago
6.0 - 11.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Location - Bangalore

Skills and Qualifications:
- 6+ years of relevant experience as a practitioner in data engineering or a related field
- Strong programming skills in Python, with experience in data manipulation and analysis libraries (e.g., Pandas, NumPy, Dask)
- Proficiency in SQL and experience with relational databases (e.g., Sybase, DB2, Snowflake, PostgreSQL, SQL Server)
- Experience with data warehousing concepts and technologies (e.g., dimensional modeling, star schema, data vault modeling, Kimball methodology, Inmon methodology, data lake design)
Familiarity with ETL/ELT processes and tools (e.g., Informatica PowerCenter, IBM DataStage, Ab Initio) and open-source frameworks for data transformation (e.g., Apache Spark, Apache Airflow). Experience with message queues and streaming platforms (e.g., Kafka, RabbitMQ). Experience with version control systems (e.g., Git). Experience using Jupyter notebooks for data exploration, analysis, and visualization. Excellent communication and collaboration skills. Ability to work independently and as part of a geographically distributed team. Nice to have: Understanding of cloud-based application development and DevOps. Understanding of business intelligence tools (Tableau, Power BI). Understanding of the trade lifecycle / financial markets. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
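As a rough illustration of the data manipulation and analysis skills the posting lists, here is a minimal, hedged sketch (desk names and figures invented): grouping records by a key and aggregating a measure, which Pandas' groupby/agg does in one line and the standard library can express as follows.

```python
from collections import defaultdict

# Toy input records; in practice these would come from a query or file.
rows = [
    {"desk": "FX", "pnl": 120.0},
    {"desk": "FX", "pnl": -30.0},
    {"desk": "Rates", "pnl": 75.0},
]

def pnl_by_desk(records):
    """Group rows by 'desk' and sum the 'pnl' measure."""
    totals = defaultdict(float)
    for r in records:
        totals[r["desk"]] += r["pnl"]
    return dict(totals)

# pnl_by_desk(rows) == {"FX": 90.0, "Rates": 75.0}
```

The equivalent Pandas call would be `df.groupby("desk")["pnl"].sum()`.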
Posted 1 week ago
3.0 - 5.0 years
5 - 8 Lacs
Pune
Work from Office
Educational Requirements Bachelor of Engineering, Bachelor of Technology Service Line Enterprise Package Application Services Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management. Location of posting - Infosys Ltd.
is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible. Technical and Professional Requirements: As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation. Below are the detailed skill requirements: Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models. Suggesting optimization techniques in existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, satellite split/merge, identification of the correct business key, etc. Designing and administering repeating design patterns for quick turnaround. Engaging and collaborating with customers effectively to understand the Data Vault use cases and briefing the technical team with technical specifications. Working knowledge of Snowflake is desirable. Working knowledge of DBT is desirable. Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake
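To make the Data Vault 2.0 mechanics above a little more concrete, here is a minimal sketch of two core conventions the role relies on: hub hash keys derived from normalised business keys, and satellite hashdiffs used for change detection. The delimiter, normalisation, and MD5 choice are common conventions, not a prescription, and all names are illustrative.

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0 style hub hash key: normalise each business key,
    join with a delimiter, then hash - so the same entity always maps to
    the same hub row regardless of source-system formatting."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

def hash_diff(record: dict, descriptive_cols: list) -> str:
    """Satellite hashdiff over descriptive attributes: a new delta is
    loaded only when this value differs from the latest stored one."""
    payload = "||".join(str(record.get(c, "")).strip() for c in descriptive_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# A hub key for a customer, and a hashdiff over its descriptive columns:
hub_customer_hk = hash_key("CUST-001")
sat_diff = hash_diff({"name": "Acme", "city": "Pune"}, ["name", "city"])
```

In Snowflake or DBT the same logic is usually expressed in SQL (e.g. `MD5(...)` over concatenated, trimmed columns); the Python version just shows the shape of the computation.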
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Good knowledge and expertise in data structures and algorithms, calculus, linear algebra, machine learning, and modeling. Experience with data warehousing concepts including star schema, snowflake schema, or data vault for data marts or data warehousing. Experience using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models. Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server. Hands-on knowledge and experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools). Experience with the Software Development Lifecycle using the Agile methodology. Knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira or Confluence). Expertise in conceptual modeling; ability to see the big picture and envision possible solutions. Experience working in a challenging, fast-paced environment. Excellent communication and stakeholder management skills. You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band - The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Overview We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs. About the Role As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards. Key Responsibilities Design and implement logical and physical data models for Databricks Lakehouse implementations Translate business requirements into efficient, scalable data models Create and maintain data dictionaries, entity relationship diagrams, and model documentation Develop dimensional models, data vault models, and other modeling approaches as appropriate Support the migration of data models from legacy systems to Databricks platform Collaborate with data architects to ensure alignment with overall data architecture Work with data engineers to implement and optimize data models Ensure data models comply with healthcare industry regulations and standards Implement data modeling best practices and standards Provide guidance on data modeling approaches and techniques Participate in data governance initiatives and data quality assessments Stay current with evolving data modeling techniques and industry trends Qualifications Extensive experience in data modeling for analytics and reporting systems Strong knowledge of dimensional modeling, data vault, and 
other modeling methodologies Experience with Databricks platform and Delta Lake architecture Expertise in healthcare data modeling and industry standards Experience migrating data models from legacy systems to modern platforms Strong SQL skills and experience with data definition languages Understanding of data governance principles and practices Experience with data modeling tools and technologies Knowledge of performance optimization techniques for data models Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Professional certifications in data modeling or related areas Technical Skills Data modeling methodologies (dimensional, data vault, etc.) Databricks platform and Delta Lake SQL and data definition languages Data modeling tools (erwin, ER/Studio, etc.) Data warehousing concepts and principles ETL/ELT processes and data integration Performance tuning for data models Metadata management and data cataloging Cloud platforms (AWS, Azure, GCP) Big data technologies and distributed computing Healthcare Industry Knowledge Healthcare data structures and relationships Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.) Healthcare data standards (HL7, FHIR, etc.) Healthcare analytics use cases and requirements Optionally: Healthcare regulatory requirements (HIPAA, HITECH, etc.)
Clinical and operational data modeling challenges Population health and value-based care data needs Personal Attributes Strong analytical and problem-solving skills Excellent attention to detail and data quality focus Ability to translate complex business requirements into technical solutions Effective communication skills with both technical and non-technical stakeholders Collaborative approach to working with cross-functional teams Self-motivated with ability to work independently Continuous learner who stays current with industry trends What We Offer Opportunity to design data models for cutting-edge healthcare analytics Collaborative and innovative work environment Competitive compensation package Professional development opportunities Work with leading technologies in the data space This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
Posted 1 week ago
3.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Your Role As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models. You will leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses. You will utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models. You will thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines. You will demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals. You will exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences. Your Profile Good knowledge and expertise in data structures and algorithms, calculus, linear algebra, machine learning and modeling. Experience with data warehousing concepts including star schema, snowflake schema or data vault for data marts or data warehousing. Experience using data modeling software like Erwin, ER/Studio or MySQL Workbench to produce logical and physical data models. Experience working in a challenging, fast-paced environment. Expertise in conceptual modeling; ability to see the big picture and envision possible solutions. Excellent communication and stakeholder management skills. What you'll love about working here You can shape your career with us.
We offer a range of career paths and internal opportunities within Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. About Capgemini
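To make the Star Schema idea at the heart of this role concrete, here is a toy, hypothetical sketch: a narrow fact table carrying surrogate keys and measures, resolved against wide dimension tables at query time. All table, column, and value names are invented for the example.

```python
# Dimension tables: surrogate key -> descriptive attributes.
dim_product = {1: {"name": "Widget", "category": "Hardware"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

# Fact table: only keys and additive measures, one row per event.
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 250.0},
    {"product_key": 1, "date_key": 20240101, "amount": 100.0},
]

def sales_by_category(facts, products):
    """A typical star-schema query: join facts to a dimension and
    aggregate the measure by a dimension attribute."""
    totals = {}
    for f in facts:
        cat = products[f["product_key"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + f["amount"]
    return totals

# sales_by_category(fact_sales, dim_product) == {"Hardware": 350.0}
```

In SQL this is the classic `FACT JOIN DIM ... GROUP BY dim.attribute` shape; keeping facts narrow and dimensions wide is what makes the pattern scale.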
Posted 1 week ago
8.0 - 12.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Title: Investment Management and Risk Data Product Owner - ISS Data (Associate Director) Department: Technology Reports To: Data Analysis Chapter Lead Level: Associate Director About your team The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation. The ISS Technology group is responsible for providing Technology solutions to the Investment Solutions & Services (ISS) business (which covers the Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy. About your role The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future state design for investment and risk data across Fidelity's key business areas. The successful candidate will have an in-depth knowledge of all data domains that service investment management, risk and attribution capabilities within the asset management industry.
The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients. Key Responsibilities Leadership and Management: Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme. Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance. Define the data product vision and strategy with end-to-end thought leadership. Lead data product documentation, enable peer reviews, get analysis effort estimation, maintain the backlog, and support end-to-end planning. Be a catalyst of change for improving efficiencies and innovation. Data Quality and Integrity: Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality. Align the functional solution with best-practice data architecture and engineering. Coordination and Communication: Senior management level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps. An advocate for the ISS Data Programme. Coordinate with internal and external teams to communicate with those impacted by data flows. Collaborate closely with Data Governance, Business Architecture, and Data Owners, etc. Conduct workshops within the scrum teams and across business teams, effectively document the minutes and drive the actions. About you Strong leadership and senior management level communication, internal and external client management and influencing skills.
At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry. 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc. In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI and FactSet support Investment, Risk, Performance and Attribution business needs. Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk and attribution. In-depth expertise in data and calculations across the investment industry covering the below: Financial data: This includes information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions. Asset-specific data: This includes data related to financial instruments reference data, like asset specifications, maintenance records, usage history, and depreciation schedules. Market data: This includes data like security prices, exchange rates, index constituents and licensing restrictions on them. Risk data: This includes data related to risk factors such as market risk, credit risk, operational risk, and compliance risk. Performance & Attribution data: This includes data on fund performance returns and attribution using various methodologies like Time Weighted Returns and transaction-based performance attribution. Should possess problem solving, attention to detail and critical thinking. Technical Skills: Hands-on SQL, Advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms. Knowledge of data management, data governance and data engineering practices.
Hands-on experience with data modelling techniques such as dimensional and data vault. Willingness to own and drive things; collaboration across business and tech stakeholders.
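The Time Weighted Returns methodology mentioned above works by geometrically linking sub-period returns, which removes the effect of external cash-flow timing. A minimal sketch (inputs assumed to be decimal returns per sub-period, e.g. 0.02 for +2%):

```python
from math import prod

def time_weighted_return(period_returns):
    """Time Weighted Return: geometrically link sub-period returns.
    TWR = (1 + r1) * (1 + r2) * ... * (1 + rn) - 1."""
    return prod(1 + r for r in period_returns) - 1

# Two sub-periods of +2% and -1% link to +0.98% overall:
twr = time_weighted_return([0.02, -0.01])  # 1.02 * 0.99 - 1 = 0.0098
```

Sub-period boundaries are placed at each external cash flow, which is exactly why the measure isolates the manager's performance from investor flow timing.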
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Remote
The client requires 5+ years of experience in data modeling and a minimum of 3 years of proficiency using ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models.
Posted 2 weeks ago
5.0 - 7.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About the Opportunity Job Type: Application 31 July 2025 Title Investment Management and Risk Data Product Owner - ISS Data (Associate Director) Department Technology Location Bangalore (hybrid / flexible working permitted) Reports To Data Analysis Chapter Lead Level Associate Director About your team The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation. The ISS Technology group is responsible for providing Technology solutions to the Investment Solutions & Services (ISS) business (which covers the Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy. About your role The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future state design for investment and risk data across Fidelity's key business areas. The successful candidate will have an in-depth knowledge of all data domains that service investment management, risk and attribution capabilities within the asset management industry.
The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients. Key Responsibilities Leadership and Management: Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme. Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance. Define the data product vision and strategy with end-to-end thought leadership. Lead data product documentation, enable peer reviews, get analysis effort estimation, maintain the backlog, and support end-to-end planning. Be a catalyst of change for improving efficiencies and innovation. Data Quality and Integrity: Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality. Align the functional solution with best-practice data architecture and engineering. Coordination and Communication: Senior management level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps. An advocate for the ISS Data Programme. Coordinate with internal and external teams to communicate with those impacted by data flows. Collaborate closely with Data Governance, Business Architecture, and Data Owners, etc. Conduct workshops within the scrum teams and across business teams, effectively document the minutes and drive the actions. About you Strong leadership and senior management level communication, internal and external client management and influencing skills.
At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry. 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc. In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI and FactSet support Investment, Risk, Performance and Attribution business needs. Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk and attribution. In-depth expertise in data and calculations across the investment industry covering the below: Financial data: This includes information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions. Asset-specific data: This includes data related to financial instruments reference data, like asset specifications, maintenance records, usage history, and depreciation schedules. Market data: This includes data like security prices, exchange rates, index constituents and licensing restrictions on them. Risk data: This includes data related to risk factors such as market risk, credit risk, operational risk, and compliance risk. Performance & Attribution data: This includes data on fund performance returns and attribution using various methodologies like Time Weighted Returns and transaction-based performance attribution. Should possess problem solving, attention to detail and critical thinking. Technical Skills: Hands-on SQL, Advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms. Knowledge of data management, data governance and data engineering practices.
Hands-on experience with data modelling techniques such as dimensional and data vault. Willingness to own and drive things; collaboration across business and tech stakeholders. Feel rewarded For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Noida
Work from Office
Project Role: Data Modeler Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills: Data Modeling Techniques and Methodologies Good to have skills: NA Minimum 7.5 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating smooth data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate workshops and meetings to gather requirements and feedback from stakeholders. - Develop and maintain comprehensive documentation of data models and architecture.
Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Modeling Techniques and Methodologies, mainly Data Vault 2.0. - Good-to-Have Skills: Experience with data governance frameworks, Snowflake warehouse/AWS. - Strong understanding of relational and non-relational database systems. - Familiarity with data warehousing concepts and ETL processes. - Experience in using data modeling tools such as Erwin or IBM InfoSphere Data Architect. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies. - This position is based at any location. - 15 years of full-time education is required. Qualification 15 years full time education
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You are familiar with AWS and Azure Cloud. You have extensive knowledge of Snowflake, with SnowPro Core certification being a must-have. In at least one project, you have utilized DBT to deploy models in production. Furthermore, you have experience in configuring and deploying Airflow and integrating various operators in Airflow, especially for DBT and Snowflake. Your capabilities also include designing build and release pipelines, and a solid understanding of the Azure DevOps ecosystem. Proficiency in Python, particularly PySpark, allows you to write metadata-driven programs. You are well-versed in Data Vault (Raw, Business) and concepts such as Point In Time and Semantic Layer. In ambiguous situations, you demonstrate resilience and possess the ability to clearly articulate problems in a business-friendly manner. Documenting processes, managing artifacts, and evolving them over time are practices you believe in and adhere to diligently. Required Skills: Data Vault, DBT, Python, PySpark, Snowflake, Airflow, AWS, Azure, Azure DevOps.
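The "metadata-driven programs" mentioned above typically describe transformations as data and apply them with a generic runner, so new rules need no new code. Here is a hedged, plain-Python sketch of the pattern (the rules and column names are invented; in practice the same idea drives PySpark column expressions or DBT model configs):

```python
# Transformation rules expressed as metadata rather than hard-coded logic.
RULES = [
    {"column": "email", "transform": str.lower},
    {"column": "country", "transform": str.upper},
]

def apply_rules(row: dict, rules: list) -> dict:
    """Generic runner: apply each metadata rule to its target column,
    leaving missing or NULL columns untouched."""
    out = dict(row)
    for rule in rules:
        col = rule["column"]
        if out.get(col) is not None:
            out[col] = rule["transform"](out[col])
    return out

cleaned = apply_rules({"email": "A@B.COM", "country": "in"}, RULES)
# cleaned == {"email": "a@b.com", "country": "IN"}
```

In a real pipeline the rules would usually live in a config table or YAML file, so adding a source or column is a metadata change, not a deployment.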
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a PySpark Data Engineer, you must have a minimum of 2 years of experience in PySpark. Strong programming skills in Python, PySpark, and Scala are preferred. It is essential to have experience in designing and implementing CI/CD, Build Management, and Development strategies. Additionally, familiarity with SQL and SQL Analytical functions is required, along with participation in key business, architectural, and technical decisions. There is an opportunity for training in AWS cloud technology. In the role of a Python Developer, a minimum of 2 years of experience in Python/PySpark is necessary. Strong programming skills in Python, PySpark, and Scala are preferred. Experience in designing and implementing CI/CD, Build Management, and Development strategies is essential. Familiarity with SQL and SQL Analytical functions and participation in key business, architectural, and technical decisions are also required. There is a potential for training in AWS cloud technology. As a Senior Software Engineer at Capgemini, you should have over 3 years of experience in Scala with a strong project track record. Hands-on experience in Scala/Spark development and SQL writing skills on RDBMS (DB2) databases are crucial. Experience in working with different file formats like JSON, Parquet, AVRO, ORC, and XML is preferred. Previous involvement in a HDFS platform development project is necessary. Proficiency in data analysis, data profiling, and data lineage, along with strong oral and written communication skills, is required. Experience in Agile projects is a plus. For the position of Data Modeler, expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling is essential. Knowledge of data warehousing concepts such as Star schema, snowflake, or data vault for data mart or data warehousing is required. Proficiency in using data modeling software like Erwin, ER studio, MySQL Workbench to produce logical and physical data models is necessary. 
Hands-on knowledge and experience with tools like PL/SQL, PySpark, Hive, Impala, and other scripting tools are preferred. Experience with the software development lifecycle using the Agile methodology is essential, and strong communication and stakeholder management skills are crucial for this role.

In this role, you will design, develop, and optimize PL/SQL procedures, functions, triggers, and packages. You will also write efficient SQL queries, joins, and subqueries for data retrieval and manipulation, and develop and maintain database objects such as tables, views, indexes, and sequences. Optimizing query performance and troubleshooting database issues to improve efficiency are key responsibilities. Collaboration with application developers, business analysts, and system architects to understand database requirements is essential, as is ensuring data integrity, consistency, and security within Oracle databases. Developing ETL processes and scripts for data migration and integration is part of the responsibilities, as is documenting database structures, stored procedures, and coding best practices. Staying up to date with Oracle database technologies, best practices, and industry trends is essential for success in this role.
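The integrity and ETL responsibilities described above often boil down to reconciliation checks between a source and a target table. A minimal sketch in Python with SQLite standing in for Oracle (in the role itself this would typically be PL/SQL); all table and column names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, total REAL);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, total REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
""")

def reconcile(conn, src, tgt):
    """Compare row count and summed totals between a source and target table."""
    src_cnt, src_sum = conn.execute(
        f"SELECT COUNT(*), SUM(total) FROM {src}").fetchone()
    tgt_cnt, tgt_sum = conn.execute(
        f"SELECT COUNT(*), SUM(total) FROM {tgt}").fetchone()
    return src_cnt == tgt_cnt and src_sum == tgt_sum

ok = reconcile(conn, "src_orders", "tgt_orders")
print(ok)  # True after a clean load
```

Count-and-sum reconciliation is a cheap first-line check; a stricter version would compare row-level hashes.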
Posted 3 weeks ago
6.0 - 10.0 years
11 - 19 Lacs
Noida
Hybrid
QA Automation Engineer

As a Senior QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities
Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load-testing strategies.
Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.
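The data-validation responsibilities above (consistency, accuracy, completeness) typically reduce to parameterised checks that a framework runs against every target table. A minimal sketch, assuming SQLite as a stand-in warehouse and hypothetical table/column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER, email TEXT);
    INSERT INTO dim_customer VALUES (1, 'a@x.com'), (2, 'b@x.com'), (3, NULL);
""")

def check_not_null(conn, table, column):
    """Completeness check: count rows where a mandatory column is NULL."""
    (nulls,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()
    return nulls

def check_unique(conn, table, column):
    """Consistency check: count key values that appear more than once."""
    (dups,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)").fetchone()
    return dups

null_emails = check_not_null(conn, "dim_customer", "email")
dup_keys = check_unique(conn, "dim_customer", "customer_id")
print(null_emails, dup_keys)
```

In a real framework these would be wrapped as parameterised test cases (e.g. one pytest case per table/column pair) and wired into the CI pipeline.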
Job Qualifications

Requirements and skills
• At least 6+ years of experience.
• Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.).
• Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing.
• Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations.
• Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes.
• Performance testing experience.
• Experience with version control systems like Git.
• Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.
• Strong communication and collaboration skills.
• Attention to detail and a passion for delivering high-quality solutions.
• Ability to work in a fast-paced environment and manage multiple priorities.
• Enthusiasm for learning new technologies and frameworks.

Experience with the following tools and technologies is desired: Qlik Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, Azure Cloud (including Logic Apps, Azure Functions, ADF).
Posted 3 weeks ago
8.0 - 13.0 years
16 - 22 Lacs
Pune
Hybrid
Data Architect / Modeler

About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Location: Pune
Working Persona: Hybrid
Experience: 9 to 14 years

Must-Have Skills:
Proficiency in Data Vault 2.0 and dimensional modeling. Strong experience with data profiling and source-to-target mapping. Expertise in working with GCP BigQuery environments. Solid understanding of data governance principles. Strong hands-on SQL skills.

Good-to-Have Skills:
Experience with the telecom domain and systems: CRM, billing, mediation, provisioning. Performance tuning in BigQuery: partitioning, clustering, cost optimization. Telecom analytics use cases: churn prediction, fraud detection, network optimization.
Working knowledge of PowerDesigner/Erwin (or similar tools) for data model creation and maintenance.

Role purpose:
The Data & Analytics (D&A) team leads the design and delivery of the overall UK Data Strategy on behalf of Vodafone UK. The team is the driving force in maximising incremental value from data through the design and delivery of 'Valuable', 'Accessible' and 'Trusted' data. Now is a really exciting time to join the team as we begin our data transformation journey to implement this agreed data strategy, working across all of Vodafone UK.

DATA MODELLING & ARCHITECTURE:
Working as part of a team, you will help to define, develop and maintain the Data Architecture blueprints for implementing TRUSTED DATA. You will be responsible for defining, developing and maintaining data models for the GCP single physical layer, data models, semantic layer, and data products. You will work with multiple IT partners and domain architects to understand data in Vodafone IT systems and design, map and model GCP data integration, enabling value through business-leading GCP capabilities, fit for the business, focusing on a "Create once, use many times" TRUSTED capability approach. Working with the Data Innovation Lead, you will ensure all Data Products have the required capabilities, and that all attributes currently produced in multiple locations across disparate reports are created within the data models/semantic layer as a core entity capability, reusable for all downstream capability. Data Architects, Data Governance and Product Owner teams will work together to document tech debt and historical workarounds which require re-engineering in the Data Warehouse or source systems to enable better, trusted data in the data models to support data products, which will accordingly influence the data improvements roadmap.

Core competencies, knowledge and experience:
1. Excellent technical, data and architectural understanding, and the ability to work with a wide variety of disciplines/platforms: strong working knowledge of data warehousing concepts, plus an ability to technically understand core IT transactional and CRM platforms and interpret source-system-generated data into actionable warehouse data through to business data in Data Products.
2. Demonstrable knowledge and experience in creating and maintaining data models in data and analytics environments.
3. Ability to engage and work with stakeholders across locations and business teams to shape the delivery of valuable outcomes.
4. Experience of working in Agile and creating value for the business through cloud-based data architecture solutions.

Key accountabilities and decision ownership:
1. Data Modelling: defining, developing, maintaining and socialising well-documented and accessible data models, information models, and semantic models which are fit for purpose to enable data capabilities for TRUSTED DATA.
2. Supporting excellence across D&A by championing the implementation of the Data Strategy TRUSTED blueprints, policies and frameworks.
3. Supporting the implementation and ongoing maintenance of roadmaps and opportunities for new data or data improvement.
4. Supporting Data By Design: providing input on data inadequacies, modelling constraints and missing IT attributes/capabilities affecting TRUSTED DATA.

VOIS Equal Opportunity Employer Commitment
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society.
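The Data Vault 2.0 modelling listed in the must-have skills above organises data into hubs (business keys), links (relationships) and satellites (descriptive attributes), keyed by hashed business keys. A minimal sketch under stated assumptions: all table and column names are hypothetical, and SQLite stands in for BigQuery:

```python
import hashlib
import sqlite3

def hash_key(business_key: str) -> str:
    """Data Vault 2.0 style surrogate hash key over the business key."""
    return hashlib.md5(business_key.encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE hub_customer (
        customer_hk TEXT PRIMARY KEY,   -- hash of the business key
        customer_bk TEXT,               -- the business key itself
        load_dts TEXT, record_source TEXT);
    CREATE TABLE sat_customer (
        customer_hk TEXT, load_dts TEXT,  -- descriptive history per hub key
        name TEXT, email TEXT);
""")

hk = hash_key("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-001", "2024-01-01", "CRM"))
conn.execute("INSERT INTO sat_customer VALUES (?, ?, ?, ?)",
             (hk, "2024-01-01", "Ada", "ada@x.com"))

row = conn.execute("""
    SELECT h.customer_bk, s.name FROM hub_customer h
    JOIN sat_customer s ON s.customer_hk = h.customer_hk
""").fetchone()
print(row)
```

Because satellites are insert-only and timestamped by `load_dts`, history is preserved without updating the hub, which is the core of the Data Vault auditability argument.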
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies that put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills!
Posted 1 month ago
10.0 - 15.0 years
25 - 40 Lacs
Noida, Pune, Delhi / NCR
Hybrid
Role & responsibilities

As an Architect, you will work to solve some of the most complex and captivating data management problems, enabling clients to operate as a data-driven organization. You will seamlessly switch between the roles of individual contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
Engage clients and understand the business requirements to translate them into data models.
Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
Use a data modelling tool to create appropriate data models.
Create and maintain the Source-to-Target Data Mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Gather and publish Data Dictionaries.
Ideate, design, and guide the teams in building automations and accelerators.
Maintain data models, as well as capture data models from existing databases and record descriptive information.
Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Use version control to maintain versions of data models.
Collaborate with Data Engineers to design and develop data extraction and integration code modules.
Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
Ideate to design and develop the next-gen data platform by collaborating with cross-functional stakeholders.
Work with the client to define, establish, and implement the right modelling approach as per the requirement.
Help define the standards and best practices.
Monitor project progress to keep the leadership teams informed on milestones, impediments, etc.
Coach team members and review code artifacts.
Contribute to proposals and RFPs.

Preferred candidate profile
10+ years of experience in the data space.
Decent SQL knowledge.
Able to suggest modeling approaches for a given problem.
Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
Real-time experience working with OLAP and OLTP database models (dimensional models).
Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
An eye for analyzing data and comfort with following agile methodology.
Adept understanding of any of the cloud services is preferred (Azure, AWS, and GCP).
Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
Experience in contributing to proposals and RFPs.
Good experience in stakeholder management.
Decent communication skills and experience in leading a team.

You are important to us, let's stay connected!
Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Perks and benefits
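A Source-to-Target Data Mapping document like the one described in this role is often maintained as structured records rather than free text, so it can be validated and published as a data dictionary. A minimal sketch; the entity, attribute, and rule names below are entirely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MappingRow:
    """One line of a Source-to-Target Mapping (STM) document."""
    source_entity: str
    source_attribute: str
    target_entity: str
    target_attribute: str
    transformation_rule: str

# Two illustrative STM entries mapping a CRM source into a dimension table.
stm = [
    MappingRow("CRM.CUSTOMER", "CUST_NM", "DIM_CUSTOMER",
               "customer_name", "TRIM + UPPER"),
    MappingRow("CRM.CUSTOMER", "CUST_DOB", "DIM_CUSTOMER",
               "birth_date", "CAST to DATE"),
]

# Derive a simple data-dictionary view: target entity -> its attributes.
by_target: dict[str, list[str]] = {}
for row in stm:
    by_target.setdefault(row.target_entity, []).append(row.target_attribute)
print(by_target)
```

Keeping the STM in a structured form like this makes it straightforward to generate glossaries, detect unmapped attributes, and feed the mapping into ETL code generation.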
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and data vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and data vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize data warehouse performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
Posted 1 month ago