7.0 - 12.0 years
25 - 32 Lacs
kochi, bengaluru
Work from Office
Data marts using AWS services (Redshift, Aurora, RDS); data modelling (Star schema, Snowflake); performance tuning (compression, materialised views); architecting scalable data warehouses using AWS Redshift, Athena, Glue; batch analytics using Kinesis, Lambda
Posted 22 hours ago
6.0 - 11.0 years
25 - 40 Lacs
hyderabad, chennai, bengaluru
Hybrid
Role & responsibilities:
- 10+ years of experience in the data space
- Good SQL knowledge; able to suggest modeling approaches for a given problem
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server)
- Hands-on experience working with OLAP & OLTP database models (dimensional models)
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus any ETL tool, data governance, and data quality
- An eye for analyzing data; comfortable following agile methodology
- Good understanding of any of the cloud services (Azure, AWS, GCP) preferred
- Keen to coach team members, collaborate with stakeholders across the organization, and take complete ownership of deliverables
- Experience contributing to proposals and RFPs
- Good experience in stakeholder management
- Good communication skills and experience leading a team
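The dimensional-modeling concepts this role keeps asking for (star schema, facts, dimensions, OLAP queries) can be sketched in a few lines. This is a minimal illustration only, using hypothetical table and column names in SQLite, not any particular employer's schema:

```python
import sqlite3

# Star schema sketch: one fact table surrounded by denormalized dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 10, 250.0);
""")
# A typical OLAP query: aggregate facts grouped by a dimension attribute.
row = cur.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchone()
print(row)  # ('Hardware', 250.0)
```

A snowflake schema differs only in that the dimensions themselves are normalized into sub-tables (e.g. `dim_product` referencing a separate `dim_category`).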
Posted 3 days ago
5.0 - 9.0 years
10 - 20 Lacs
bengaluru
Hybrid
Position: Senior Software Developer - Analytics
Overview: The Senior Software Developer will work closely with the product manager, Implementation Consultants (ICs), and clients to gather requirements that meet the data analysis needs of the company or a client. They must have good collaboration skills, and will provide direction to the team on various analytics-related activities.
Key Tasks & Responsibilities:
- Experienced in Qlik Sense architecture design, with good knowledge of load script implementation and best practices
- Hands-on experience in Qlik Sense development, dashboarding, data modelling, and reporting techniques
- Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources
- Good at data transformation, creation of QVD files, and set analysis
- Data modelling using dimensional modelling, Star schema, and Snowflake schema
- Strong SQL skills (SQL Server) to validate Qlik Sense dashboards and work on internal applications
- Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is a plus
- Work with ICs, the product manager, and clients to gather requirements
- Configuration, migration, and support of Qlik Sense applications
- Thoughtful implementation of Qlik Sense best practices for efficiency and re-usability
- Research and utilize new technologies
- Collaborate with the Software Quality Assurance (SQA) team to test application functionality
- Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures
- Manage multiple timelines and deliverables (for one or more clients) and handle client communications as assigned
- Other duties as assigned
Education/Language: BTech / MTech / Master of Science in Computer Science, or equivalent work experience; good verbal and written communication skills.
Professional Skills & Experience:
- Minimum of 3-5 years of experience implementing end-to-end business intelligence using Qlik Sense
- Thorough experience with the Qlik Sense architecture, design, development, test, and deployment process
- Thorough understanding of Qlik Sense best practices (re-usability, efficiency, optimization)
- Knowledge of clinical datasets and standards is a plus (e.g., SDTM, CDISC (92, 45, 101), Q Format, customized data formats, etc.)
- Excellent understanding of relational database concepts, data modelling, and design
- Excellent knowledge of writing SQL code and ETL procedures using MS SQL Server
- Strong software development lifecycle experience (Agile methodology experience is a plus)
- Strong technical project management experience and team leadership skills, including scope management, work planning, and work delegation
- Strong troubleshooting skills and use of defect/feature management systems
- Proven ability to work independently and with technical team members (startup environment experience is a plus)
- Strong analytical skills and strong decision-making capabilities
Technical Skills & Experience:
- 3+ years of experience in Qlik Sense architecture and design
- 3+ years of experience developing, testing, and deploying Qlik Sense applications
- 3+ years with SQL Server and ETL processes
- 3+ years with data modelling (physical & logical)
- Experience with performance tuning and Qlik Sense best practices
- Experience with dimensional modelling, Star schema, and Snowflake schema
- Knowledge of clinical trial data and SDTM standards is a plus
Posted 4 days ago
9.0 - 14.0 years
32 - 45 Lacs
hyderabad, chennai, bengaluru
Work from Office
Experience in SQL, ETL, data modeling, data warehousing, and cloud platforms. Share your resume with b.roshitha@tekgenieservices.com, 9063478483.
Posted 5 days ago
5.0 - 10.0 years
15 - 30 Lacs
pune, gurugram, bengaluru
Work from Office
Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, SQL queries, Snowpipe, SnowSQL, performance tuning, query optimization, stored procedures, DBT, Matillion and ETL.
We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Snowflake utilities, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.
Key Responsibilities:
- Minimum of 5+ years of hands-on experience in data engineering, with a focus on data warehousing, business intelligence, and related technologies
- Data integration & pipeline development: develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources
- SQL query development & optimization: write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis
- Data modeling & ELT implementation: implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT; design and optimize high-performance data architectures
- Business requirement analysis: collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions
- Troubleshooting & data quality: perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards
- Collaboration & documentation: work closely with cross-functional teams to integrate data solutions; create and maintain clear documentation for data processes, data models, and pipelines
Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes
- Strong proficiency in SQL for relational databases and writing complex queries
- Experience with Informatica PowerCenter for data integration and ETL development
- Experience using Power BI for data visualization and business intelligence reporting
- Experience with Fivetran for automated ELT pipelines
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT
- Strong data analysis, requirement gathering, and mapping skills
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi
- Proficiency in Python for data processing (other languages like Java and Scala are a plus)
Education: graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
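The posting asks for Slowly Changing Dimension Type-2 implemented with DBT; in DBT this is normally done with snapshots, but the underlying mechanic is easy to show directly. Below is a hedged sketch of the SCD Type-2 technique itself in plain Python (the record layout and dates are hypothetical, not any DBT API):

```python
# SCD Type-2: never overwrite a dimension row; instead expire the old version
# and append a new one, so history is preserved.
HIGH_DATE = "9999-12-31"  # conventional "open-ended" validity marker

def scd2_apply(dimension, updates, today):
    """Close out changed rows and append new versions, preserving history."""
    current = {r["key"]: r for r in dimension if r["valid_to"] == HIGH_DATE}
    for upd in updates:
        row = current.get(upd["key"])
        if row is None:
            # brand-new key: insert as the current version
            dimension.append({**upd, "valid_from": today, "valid_to": HIGH_DATE})
        elif row["attr"] != upd["attr"]:
            row["valid_to"] = today  # expire the old version
            dimension.append({**upd, "valid_from": today, "valid_to": HIGH_DATE})
    return dimension

dim = [{"key": 1, "attr": "Bengaluru", "valid_from": "2023-01-01", "valid_to": HIGH_DATE}]
dim = scd2_apply(dim, [{"key": 1, "attr": "Hyderabad"}], "2024-06-01")
print(len(dim))  # 2: the expired original plus the new current version
```

A DBT snapshot achieves the same effect declaratively by comparing a tracked column (or a timestamp) between runs and writing `dbt_valid_from` / `dbt_valid_to` columns.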
Posted 5 days ago
6.0 - 10.0 years
8 - 13 Lacs
hyderabad
Work from Office
Position Title: Data Engineer (Snowflake Lead)
Experience: 7+ Years
Shift Schedule: Rotational shifts
Location: Hyderabad
Role Overview: We are seeking an experienced Data Engineer with strong expertise in Snowflake to join the Snowflake Managed Services team. This role involves data platform development, enhancements, and production support across multiple clients. You will be responsible for ensuring the stability, performance, and continuous improvement of Snowflake environments.
Key Responsibilities:
- Design, build, and optimize Snowflake data pipelines, data models, and transformations
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations
- Troubleshoot job failures, resolve incidents, and perform root cause analysis (RCA)
- Monitor warehouses, tune queries, and optimize Snowflake performance and costs
- Manage service requests such as user provisioning, access control, and role management
- Create and maintain documentation, runbooks, and standard operating procedures
Required Skills & Experience:
- 5+ years of hands-on experience in Snowflake development and support
- Strong expertise in SQL, data modeling, and query performance tuning
- Experience with ETL/ELT development and orchestration tools (e.g., Azure Data Factory)
- Familiarity with CI/CD pipelines and scripting (Python or PySpark)
- Strong troubleshooting and incident resolution skills
Preferred Skills:
- SnowPro Core certification
- Experience with ticketing systems (ServiceNow, Jira)
- Hands-on experience with Azure cloud services
- Knowledge of ITIL processes
Posted 6 days ago
6.0 - 11.0 years
20 - 35 Lacs
hyderabad, chennai, bengaluru
Hybrid
Are you ready to make a difference in the data space? Looking for immediate joiners - only candidates available to join in September 2025 are eligible to apply.
Job Title: Data Modeller & Architect
Location: Bengaluru, Chennai, Hyderabad
What do we expect?
- 6-12 years of experience in data modelling
- Good SQL knowledge; able to suggest modeling approaches for a given problem
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server)
- Hands-on experience working with OLAP & OLTP database models (dimensional models)
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus any ETL tool, data governance, and data quality
- An eye for analyzing data; comfortable following agile methodology
- Good understanding of any of the cloud services (Azure, AWS, GCP) preferred
- Keen to coach team members, collaborate with stakeholders across the organization, and take complete ownership of deliverables
- Experience contributing to proposals and RFPs
- Good experience in stakeholder management
- Good communication skills and experience leading a team
Contact Amirtha (HR - Aram Hiring) - WhatsApp your resume to 8122080023 / amirtha@aramhiring.com
Who is our client: Our client is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. They offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. They are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Their purpose is to provide certainty to shape a better tomorrow. Our client operates with 4000+ technologists and consultants based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare.
Many of their team leaders rank in Top 10 and 40 Under 40 lists, exemplifying their dedication to innovation and excellence. The client is Great Place to Work-Certified (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG and others, and has been ranked among the best and fastest-growing analytics firms by Inc., Financial Times, Economic Times and Analytics India Magazine.
Curious about the role? What would your typical day look like? As an Engineer and Architect, you will work to solve some of the most complex and captivating data management problems that enable the client to operate as a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data modeling architect as each project demands to define, design, and deliver actionable insights. On a typical day, you might:
- Engage clients and understand the business requirements to translate them into data models
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights
- Use a data modelling tool to create appropriate data models
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries
- Ideate, design, and guide the teams in building automations and accelerators
- Maintain data models, capture data models from existing databases, and record descriptive information
- Contribute to building data warehouses & data marts (on cloud) while performing data profiling and quality analysis
- Use version control to maintain versions of data models
- Collaborate with data engineers to design and develop data extraction and integration code modules
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing
- Ideate on the design and development of the next-gen data platform by collaborating with cross-functional stakeholders
- Work with the client to define, establish, and implement the right modelling approach as per the requirement
- Help define standards and best practices
- Monitor project progress to keep leadership teams informed on milestones, impediments, etc.
- Coach team members and review code artifacts
- Contribute to proposals and RFPs
You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry. Kindly share your resume with amirtha@aramhiring.com / 8122080023
Posted 6 days ago
12.0 - 16.0 years
35 - 45 Lacs
hyderabad, chennai, bengaluru
Hybrid
Role - Data Architect - Data Modeling
Exp - 12-16 Yrs
Locs - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent FTE
Client - Data Analytics Global Leader
Must have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts
Good to have skills:
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
Posted 6 days ago
6.0 - 10.0 years
25 - 35 Lacs
hyderabad, chennai, bengaluru
Hybrid
Role - Data Modeler/Senior Data Modeler
Exp - 6 to 9 Yrs
Locs - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent
Must have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts
Good to have skills:
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
Posted 6 days ago
5.0 - 10.0 years
16 - 31 Lacs
gurugram, bengaluru
Hybrid
Role: Data Modeller
Experience: 5-12 Years
Location: Gurugram/Bangalore
Notice Period: Immediate to 45 days
Your scope of work / key responsibilities:
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent, and interpretable manner
- Gather, distil, and harmonize data requirements, and design coherent conceptual, logical, and physical data models and associated physical feed formats to support these data flows
- Articulate business requirements and build source-to-target mappings with complex ETL transformations
- Write complex SQL statements and profile source data to validate data transformations
- Contribute to requirement analysis and database design - transactional and dimensional data modelling
- Normalize/de-normalize data structures, introducing hierarchies and inheritance wherever required in existing/new data models
- Develop and implement data warehouse projects independently
- Work with data consumers and data suppliers to understand detailed requirements and propose standardized data models
- Contribute to improving the Data Management data models
- Present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices
- Understanding of the insurance domain
- Basic understanding of AWS cloud
- Good understanding of Master Data Management, data quality, and data governance
- Basic understanding of data visualization tools like SAS VA and Tableau
- Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle
Key qualifications and experience:
- Extensive practical experience in information technology and software development projects, with at least 8+ years of experience designing operational data stores & data warehouses
- Extensive experience with data modelling tools such as Erwin or SAP PowerDesigner
- Strong understanding of ETL and data warehouse concepts, processes, and best practices
- Proficient in data modelling, including conceptual, logical, and physical data modelling for both OLTP and OLAP
- Ability to write complex SQL for data transformations and data profiling in source and target systems
- Basic understanding of SQL vs. NoSQL databases
- A combination of solid business knowledge and technical expertise with strong communication skills
- Excellent analytical and logical thinking
- Good verbal & written communication skills, with the ability to work independently as well as in a team environment, providing structure in ambiguous situations
Interested candidates can share their resume at divya@beanhr.com
Posted 1 week ago
7.0 - 12.0 years
35 - 60 Lacs
pune, bengaluru, delhi / ncr
Work from Office
Data Modeller
Permanent
Chennai / Bangalore / Pune / Hyderabad / Delhi NCR
Preferred candidate profile:
- 7-15+ years of experience in data modeling
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server)
- Hands-on experience working with OLAP & OLTP database models (dimensional models)
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus any ETL tool, data governance, and data quality
- An eye for analyzing data; comfortable following agile methodology
- Good understanding of any of the cloud services (Azure, AWS, GCP) preferred
Required skills: strong experience in SQL (tables, attributes, joins, queries), strong DWH skills, relational/dimensional/ER modeling, Data Vault modeling, OLAP/OLTP, facts and dimensions, normalisation (3NF), data marts, schemas, keys, dimension types, M:M relationships, and forward/reverse engineering.
Good to have: tool experience (Erwin, ER Studio, Visio, PowerDesigner) and cloud experience.
If interested, kindly share your updated profile along with the details below.
Full Name:
Contact #:
Email ID:
Total experience:
Key skills:
Current Location:
Relocation:
Current Employer:
CTC (Including Variable):
ECTC:
Notice Period:
Holding any offer:
Posted 1 week ago
0.0 - 2.0 years
25 - 40 Lacs
pune
Work from Office
Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow - all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.
Job Overview: We are looking for a motivated and detail-oriented Junior Data Engineer with at least 2 years of experience in building and maintaining data pipelines. The ideal candidate should have hands-on experience with DBT, Snowflake, and data modeling concepts. You will work closely with senior engineers and analysts to support data infrastructure and analytics initiatives.
Key Responsibilities:
- Develop and maintain data transformation workflows using DBT
- Build and optimize data models in Snowflake to support reporting and analytics
- Write efficient and well-documented SQL queries for data extraction and transformation
- Collaborate with data analysts and business teams to understand data needs and deliver solutions
- Ensure data quality and consistency through testing and validation
- Participate in code reviews and contribute to improving data engineering practices
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 2+ years of experience in a data engineering or data analytics role
- Proficiency in DBT for data transformation and documentation
- Experience working with Snowflake or other cloud data warehouses
- Strong knowledge of SQL and data modeling techniques (e.g., star schema, normalization)
- Familiarity with version control tools like Git
- Good communication and problem-solving skills
Nice to Have:
- Exposure to cloud platforms (AWS, GCP, or Azure)
- Experience with data orchestration tools like Airflow or Prefect
- Familiarity with BI tools such as Looker, Tableau, or Power BI
- Understanding of CI/CD processes
Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling personal information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
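The "testing and validation" responsibility above is what DBT expresses as declarative `not_null` and `unique` tests. A hedged sketch of the same checks in plain Python (the `orders` records and column names are hypothetical):

```python
def check_not_null(rows, column):
    """Return the rows where the given column is missing or NULL."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once in the given column."""
    seen, dupes = set(), []
    for r in rows:
        value = r[column]
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

orders = [
    {"id": 1, "amount": 10},
    {"id": 1, "amount": 12},    # duplicate id -> uniqueness violation
    {"id": 2, "amount": None},  # missing amount -> not-null violation
]
null_rows = check_not_null(orders, "amount")
duplicate_ids = check_unique(orders, "id")
print(len(null_rows), duplicate_ids)  # 1 [1]
```

In DBT the equivalent is a few lines of YAML on the model (`tests: [not_null, unique]`), and a failing check blocks the pipeline the same way a failing assertion would here.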
Posted 1 week ago
6.0 - 9.0 years
15 - 25 Lacs
kochi, thiruvananthapuram
Work from Office
Mandatory skills: strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience; good communication
Notice period: immediate to 20 days
Posted 1 week ago
7.0 - 12.0 years
8 - 18 Lacs
pune, gurugram, bengaluru
Work from Office
Job Overview: We are seeking a highly skilled and detail-oriented Snowflake Data Engineer to join our data engineering team. The ideal candidate will design and implement scalable data warehouse solutions leveraging Snowflake, develop robust data pipelines, and ensure optimal performance of analytics workloads. This role requires deep expertise in Snowflake architecture, SQL, and modern ELT/ETL practices.
Key Responsibilities:
- Design and implement scalable data warehouse solutions using Snowflake
- Develop complex SQL queries for analytics, reporting, and data transformation
- Build and maintain robust ELT/ETL pipelines using Snowflake-native capabilities and external tools like ADF or Airflow
- Apply dimensional modeling techniques (Star/Snowflake schemas) to support BI and analytics
- Optimize performance and manage compute costs through profiling and tuning
- Manage Snowflake objects including tables, views, materialized views, streams, tasks, UDFs, and stored procedures
- Ensure data quality, governance, and lineage across the data lifecycle
- Collaborate with cross-functional teams to gather requirements and build scalable data solutions
- Troubleshoot issues related to data ingestion, transformation, and query performance
- Stay up to date with Snowflake features like Time Travel, Zero-Copy Cloning, Search Optimization, and CDC via Streams/Tasks
Required Skills:
- Advanced SQL: joins, CTEs, window functions, recursive queries, analytical functions
- Snowflake architecture: virtual warehouses, micro-partitions, clustering, scaling, caching
- Data modeling: Star/Snowflake schemas, normalization/denormalization strategies
- Performance tuning: query optimization, result caching, warehouse sizing
- ETL/ELT development: using Snowflake-native features and orchestration tools like ADF or Airflow
- Stored procedures & scripting: experience with SQL- and JavaScript-based procedures and functions within Snowflake
Preferred Experience & Tools:
- Azure Data Factory (ADF): pipelines, triggers, linked services, and integration with ADLS Gen2
- Programming skills: Python or PySpark for data transformation and automation
- Snowpipe & Streams/Tasks: real-time ingestion and Change Data Capture (CDC)
- Data security: implementation of row-level security, masking, and encryption
- DevOps practices: familiarity with Git, CI/CD pipelines, and Terraform (basic level)
Bonus/Value-Add Skills:
- Strong analytical and problem-solving abilities
- Excellent communication and documentation skills
- Agile/Scrum methodology experience
- Leadership or mentoring experience
- Ability to manage priorities in fast-paced, multi-project environments
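"Advanced SQL" here means things like combining a CTE with a window function, e.g. picking the top sale per region. A minimal runnable sketch using SQLite (the `sales` table is hypothetical; the same SQL runs unchanged on Snowflake):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount INTEGER);
INSERT INTO sales VALUES ('south', 100), ('south', 300), ('north', 200);
""")
# CTE ranks rows per region with a window function; the outer query
# keeps only the top-ranked sale in each partition.
rows = conn.execute("""
    WITH ranked AS (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    SELECT region, amount FROM ranked WHERE rn = 1 ORDER BY region
""").fetchall()
print(rows)  # [('north', 200), ('south', 300)]
```

Without the window function this requires a correlated subquery or a self-join, which is both slower and harder to read, hence the emphasis in the skills list.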
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
bengaluru
Work from Office
Role Purpose: Design, test, and maintain software programs for operating systems or applications to be deployed at a client site, ensuring they meet 100% of quality assurance parameters.
Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise
- 4+ years of data warehouse experience: hands-on knowledge of methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema; normalization/denormalization; dimensions; aggregations; etc.
- 4+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, etc.
- 3+ years on Python: advanced Python expertise
- 3+ years on any cloud platform (AWS preferred): hands-on experience with Lambda, S3, SNS/SQS, EC2 is the bare minimum
- 3+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, DBT, etc.
- 3+ years developing functional metrics in a specific business vertical (finance, retail, telecom, etc.)
Must-have soft skills:
- Clear written and verbal communication, especially about time off, delays in delivery, etc.
- Team player: works in and with the team
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.
Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP, or any other recognized software vendor
- 4+ years on any ETL/ELT tool (Informatica, Pentaho, Fivetran, DBT, etc.)
- 4+ years developing functional metrics in a specific business vertical
- 4+ years of team lead experience
- 3+ years in a large-scale support organization supporting thousands of users
Posted 1 week ago
4.0 - 7.0 years
10 - 18 Lacs
noida, pune, bangalore rural
Hybrid
Role & responsibilities: We're looking for candidates with strong technology and data understanding in the data modelling space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.
Your key responsibilities:
- Employ tools and techniques used to understand and analyze how data is collected, updated, stored, and exchanged
- Define and employ data modeling and design standards, tools, best practices, and related development methodologies
- Design, review, and maintain data models
- Perform data analysis activities to capture data requirements and represent them in data model visualizations
- Manage the life cycle of the data model from requirements to design to implementation to maintenance
- Work closely with data engineers to create optimal physical data models of datasets
- Identify areas where data can be used to improve business activities
Skills and attributes for success:
- 3-7 years of experience, with 3+ years of relevant data modelling knowledge
- Experience with data modeling tools including but not limited to Erwin Data Modeler, ER/Studio, Toad, etc.
- Strong knowledge of SQL
- Basic ETL skills to ensure the implementation meets documented specifications for ETL processes, including data translation/mapping and transformation
- Good data warehouse knowledge
Optional:
- Visualisation skills
- Knowledge of DQ and data profiling techniques and tools
Interested candidates can apply via the link below: https://careers.ey.com/job-invite/1611905/
Regards, Aakriti Jain
Posted 1 week ago
6.0 - 11.0 years
30 - 35 Lacs
pune
Work from Office
Role Description: As a SQL Engineer, you will be responsible for the design, development, and optimization of complex database systems. You will write efficient SQL queries and stored procedures, with expertise in data modeling, performance optimization, and large-scale relational databases.
Your key responsibilities:
- Design, develop, and optimize complex SQL queries, stored procedures, views, and functions
- Work with large datasets to perform data extraction, transformation, and loading (ETL)
- Develop and maintain scalable database schemas and models
- Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns
- Maintain data security and compliance with data governance policy
Your skills and experience:
- 10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL
- Strong working experience with PL/SQL and T-SQL
- Strong understanding of data modelling, normalization, and relational DB design
Desirable skills that will help you excel:
- Ability to write high-performance, heavily resilient queries in Oracle / PostgreSQL / MSSQL
- Working knowledge of database modelling techniques like Star schema, fact-dimension models, and Data Vault
- Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc.
- Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets
- Good experience in performance tuning, query optimization, and indexing
- Hands-on experience with object storage and scheduling tools
- Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms
- Experience in GCP, cloud database migration experience, hands-on with Postgres
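The extraction-transformation-loading duty above reduces to three steps: read from a source table, clean the rows, and write them to a target. A tiny end-to-end sketch in Python with SQLite (the schema and cleaning rules are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_customers (name TEXT, city TEXT);
CREATE TABLE dim_customer (name TEXT, city TEXT);
INSERT INTO raw_customers VALUES ('  Asha ', 'bengaluru'), ('Ravi', NULL);
""")
rows = conn.execute("SELECT name, city FROM raw_customers").fetchall()   # extract
cleaned = [(n.strip(), (c or "unknown").title()) for n, c in rows]       # transform
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)", cleaned)      # load
loaded = conn.execute("SELECT * FROM dim_customer ORDER BY name").fetchall()
print(loaded)  # [('Asha', 'Bengaluru'), ('Ravi', 'Unknown')]
```

Production ETL tools (Pentaho, Informatica, StreamSets, as listed above) package exactly this extract/transform/load cycle, plus scheduling, error handling, and lineage.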
Posted 2 weeks ago
8.0 - 10.0 years
9 - 13 Lacs
bengaluru
Work from Office
As a Database Engineer, you would be responsible for design, development and optimization of database systems. You would be writing efficient SQL queries, stored procedures and possess expertise in data modeling, performance optimization and working with large-scale relational databases What well offer you As part of our flexible scheme, here are just some of the benefits that youll enjoy: Your key responsibilities Design, develop and optimize complex SQL queries, stored procedures, views and functions. Work with large datasets to perform data extraction, transformation and loading (ETL). Develop and maintain scalable database schemas and models Troubleshoot and resolve database-related issues including performance bottlenecks and data quality concerns Maintain data security and compliance with data governance policy. Your skills and experience 8-10 years of hands-on experience with SQL in relational databases SQL Server, Oracle, MySQL PostgreSQL. Strong working experience of PLSQL and T-SQL. Strong understanding of data modelling, normalization and relational DB design. Effective learning, problem-solving, decision-making capability Strong verbal and written communication skills Self-starter, proactive and excellent team player with ability to work well under pressure in a fast-paced environment and always with professionalism. Ability to be open minded, learn new technologies on the job, share information, and transfer knowledge. Enterprise technology knowledge and experience (e.g., application/data migration, architecture, infrastructure, data transfer methods (SFTP), application and database technologies) Good to have Ability to write high performant, heavily resilient queries in Oracle PostgreSQL MSSQL Working knowledge of Database modelling techniques like Star Schema, Fact-Dimension Models and Data Vault. Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles etc. 
Hands-on experience with ETL tools (Pentaho, Informatica, StreamSets). Good experience in performance tuning, query optimization, and indexing. Hands-on experience with object storage and scheduling tools. Experience with cloud-based data services such as data lakes, data pipelines, and machine learning platforms (good to have). Experience with GCP and cloud database migration, hands-on with PostgreSQL (good to have).

Skills and experience that will help you excel: Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent). Familiarity with finance, specifically the asset management industry and alternative investments. Positive attitude and a team player. Proactive, with the ability to work independently. Open to learning, adapting, and solutioning with new technologies. Defines and implements best practices, solutions, and standards related to their area of expertise.
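The star-schema modelling this listing asks for can be sketched with Python's standard library; SQLite stands in for the databases named above, and every table name, column, and figure here is invented for illustration:

```python
import sqlite3

# A minimal star schema: one fact table surrounded by dimension tables.
# All names and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);

INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0);
""")

# The typical reporting query shape: aggregate the fact table,
# grouped by attributes pulled in from the dimensions.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)  # [(2023, 'Books', 100.0), (2024, 'Books', 150.0), (2024, 'Games', 80.0)]
```

The same query shape carries over to Oracle, SQL Server, or PostgreSQL; only the DDL dialect changes.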
Posted 2 weeks ago
6.0 - 10.0 years
11 - 16 Lacs
mumbai, pune, bengaluru
Work from Office
The Team: Fidelity's Workplace Investing Reporting and Analytics chapter is seeking a BI Engineer to play a key role in enhancing the client experience for the PSW reporting application by delivering new functionality on a newer technology stack. This person will provide technical leadership and oversight, implement architecture recommendations, work with technical partners, and assist developers and testers as needed. The role demands significant collaboration with members of various business and IT groups throughout the lifecycle of a typical project. Our engineering team is innovative, diverse, hardworking, and self-driven. We work in a very dynamic agile environment. The Expertise You Have: 6-10 years of experience as a Power BI developer with a strong portfolio of built reports, dashboards, and visualizations. Proficient in Power BI Desktop, Power BI Service, DAX, Power Query, and custom visuals. Hands-on experience working with the Snowflake data warehouse, including integration, data modeling, and SQL querying. Strong understanding of data modeling concepts (star schema, snowflake schema, etc.) and the ability to design scalable data models. Knowledge of ETL tools and reporting in general. Hands-on experience with Oracle and PL/SQL. Conceptual understanding of embedding reports in a portal environment. Experience with large-scale data visualization, managing performance issues, and implementing best practices for BI reports. Strong problem-solving skills, attention to detail, and the ability to troubleshoot data or report issues.
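The star-vs-snowflake distinction this listing tests for comes down to whether dimension attributes live denormalized in one table or are split into their own sub-dimension tables. A minimal snowflake-schema sketch, with SQLite standing in for the warehouse and all names invented:

```python
import sqlite3

# Snowflake schema: the product dimension is normalized, so category
# attributes sit in their own table. Names and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY,
                           category_id INTEGER REFERENCES dim_category);
CREATE TABLE fact_sales   (product_id INTEGER, amount REAL);

INSERT INTO dim_category VALUES (1, 'Electronics');
INSERT INTO dim_product  VALUES (100, 1), (101, 1);
INSERT INTO fact_sales   VALUES (100, 250.0), (101, 125.0);
""")

# The extra hop through dim_category is the snowflake trade-off:
# less redundancy in the dimension, one more join per query.
total = conn.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product  p ON p.product_id  = f.product_id
    JOIN dim_category c ON c.category_id = p.category_id
    GROUP BY c.name
""").fetchone()
print(total)  # ('Electronics', 375.0)
```

In a star schema the `name` column would simply live on `dim_product`, removing one join at the cost of repeating the category text on every product row.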
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are searching for a Data Engineer with strong experience in Databricks and the ability to join immediately. The ideal candidate should have 4 to 6 years of experience and must work from office (WFO) in Chennai. Key skills required for this role include proficiency in SQL, ETL tools, ADF, ADB, and reporting tools. Key Requirements: - Expert-level knowledge of RDBMS (SQL Server), with the ability to write SQL queries, create and manage objects, and optimize DB/DWH operations. - Good understanding of transactional and dimensional data modelling, star schema, facts/dimensions, and relationships. - Familiarity with ETL concepts and experience with tools like Azure Data Factory, Azure Databricks, and Airflow. - In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating workflows, implementing dynamic and parameterized pipelines, and optimizing Spark-based data transformations for large-scale integrations. - Hands-on experience with Databricks Unity Catalog for centralized data governance, access control, auditing, and managing data assets securely across multiple workspaces. - Experience in at least one development lifecycle of an end-to-end ETL project involving the mentioned ETL tools. - Ability to write and review test cases, test code, and validate code. - Good understanding of SDLC practices like source control and version management, and use of Azure DevOps and CI/CD practices. Preferred Skills: - Knowledge of Python is a bonus. - Knowledge of SSIS is a bonus. - Familiarity with Azure DevOps and source control/repos is advantageous for this role.
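The "dynamic and parameterized pipelines" idea above is tool-agnostic: the same extract-transform-load body runs for any parameter value passed in at trigger time. A plain-Python sketch of that shape, using no ADF or Databricks APIs; every function, field, and value here is invented for illustration:

```python
# A parameterized ETL step in plain Python. In ADF the "country"
# parameter would arrive via a pipeline parameter; here it is just
# a function argument. All names and data are hypothetical.

def extract(source_rows):
    # In a real pipeline this would read from a source dataset.
    return list(source_rows)

def transform(rows, country):
    # Parameterization: one pipeline body serves every country.
    return [r for r in rows if r["country"] == country]

def load(rows, target):
    # In a real pipeline this would write to a sink dataset.
    target.extend(rows)
    return len(rows)

source = [
    {"id": 1, "country": "IN"},
    {"id": 2, "country": "US"},
    {"id": 3, "country": "IN"},
]
target = []
loaded = load(transform(extract(source), country="IN"), target)
print(loaded)  # 2
```

The value of the pattern is that adding a new country means changing a trigger parameter, not cloning the pipeline.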
Posted 2 weeks ago
6.0 - 8.0 years
8 - 18 Lacs
hyderabad, pune, chennai
Hybrid
Role & responsibilities: Senior Data Modelling. Preferred candidate profile: Job Title: Data Engineer (Data Modelling / SQL / Data Warehousing). Experience: 6-8 Years. Location: Chennai / Pune / Hyderabad. Key Responsibilities: Design and implement data models (star schema, snowflake schema, fact & dimension tables) for analytics and reporting. Develop and optimize SQL queries, stored procedures, and ETL workflows for large-scale data processing. Build and maintain data warehousing solutions, ensuring scalability, performance, and reliability. Collaborate with BI, Data Science, and application teams to deliver optimized data solutions. Perform data quality checks, governance, and performance tuning for large datasets. Required Skills: Strong expertise in data modelling (star schema, snowflake schema, normalization/denormalization). Advanced SQL skills (query optimization, indexing, stored procedures). In-depth knowledge of data warehousing concepts & ETL processes. Experience with at least one ETL tool (Informatica, SSIS, Talend, DataStage, etc.). Exposure to cloud data platforms. Good problem-solving, communication, and stakeholder management skills.
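The data quality checks mentioned above typically start with key integrity on a dimension before it is loaded: the business key must be non-null and unique. A minimal sketch in plain Python; the column names and rows are invented for illustration:

```python
from collections import Counter

# Hypothetical dimension rows about to be loaded; two defects planted.
dim_rows = [
    {"customer_key": "C001", "name": "Asha"},
    {"customer_key": "C002", "name": "Ravi"},
    {"customer_key": "C002", "name": "Ravi K"},   # duplicate business key
    {"customer_key": None,   "name": "Unknown"},  # null business key
]

keys = [r["customer_key"] for r in dim_rows]

# Check 1: no null business keys.
null_keys = sum(1 for k in keys if k is None)

# Check 2: business key is unique across the batch.
duplicates = [k for k, n in Counter(k for k in keys if k is not None).items()
              if n > 1]

print(null_keys, duplicates)  # 1 ['C002']
```

In a warehouse these checks would run as SQL (`COUNT(*) ... GROUP BY key HAVING COUNT(*) > 1`) and gate the load, quarantining rows that fail.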
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
lucknow, uttar pradesh
On-site
You should have strong experience in SQL development and query optimization, along with a good understanding of database indexing, partitioning, and performance tuning. Knowledge of data warehousing concepts such as star schema and dimension modeling is essential. Experience in creating dimension tables for analytics and reporting, as well as implementing data partitioning and performance-tuning techniques in data warehousing (DWH) environments, will be beneficial. Additionally, familiarity with tools such as Snowflake, Talend, TeamCity, Jenkins, Git, and ETL scripts is required. Experience working with database objects, job schedulers for execution and monitoring, branching, merging, and automated deployment for ETL processes will be advantageous. Excellent exposure to cloud technologies is an added advantage.
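The indexing side of the tuning work described above can be seen directly in a query plan: the optimizer switches from a full table scan to an index search once a suitable index exists. A sketch with SQLite standing in for the warehouse; table and index names are invented:

```python
import sqlite3

# Compare the query plan for the same predicate before and after
# creating an index on the filtered column. Names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")

plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# The last column of each plan row is a human-readable detail string:
# a full scan before the index, an index search after it.
print(plan_before[-1][-1])
print(plan_after[-1][-1])
```

The same before/after habit applies to Oracle AWR reports or SQL Server execution plans; the tooling differs, the workflow does not.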
Posted 2 weeks ago
5.0 - 9.0 years
4 - 7 Lacs
gurugram
Work from Office
Primary Skills: SQL (advanced level); SSAS (SQL Server Analysis Services), multidimensional and/or tabular model; MDX / DAX (strong querying capabilities); data modeling (star schema, snowflake schema). Secondary Skills: ETL processes (SSIS or similar tools); Power BI / reporting tools; Azure Data Services (optional but a plus). Role & Responsibilities: Design, develop, and deploy SSAS models (both tabular and multidimensional). Write and optimize MDX/DAX queries for complex business logic. Work closely with business analysts and stakeholders to translate requirements into robust data models. Design and implement ETL pipelines for data integration. Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred). Optimize existing cubes and data models for performance and scalability. Ensure data quality, consistency, and governance standards. Top Skill Set: SSAS (tabular + multidimensional modeling); strong MDX and/or DAX query writing; advanced SQL for data extraction and transformations; data modeling concepts (fact/dimension, slowly changing dimensions, etc.); ETL tools (SSIS preferred); Power BI or similar BI tools; understanding of OLAP & OLTP concepts; performance tuning (SSAS/SQL). Skills: analytical skills, collaboration, communication, data visualization.
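The "slowly changing dimensions" concept in the skill set above is usually meant as Type 2: instead of overwriting a changed attribute, the current dimension row is closed out and a new versioned row is opened. A plain-Python sketch; field names, dates, and data are invented for illustration:

```python
from datetime import date

# One current row for a hypothetical customer dimension.
dim_customer = [
    {"customer_id": "C001", "city": "Pune",
     "valid_from": date(2020, 1, 1), "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, change_date):
    """Type-2 change: close the current version, append a new one."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # attribute unchanged; nothing to version
            row["valid_to"] = change_date   # close the old version
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": change_date, "valid_to": None,
                "is_current": True})

apply_scd2(dim_customer, "C001", "Mumbai", date(2024, 6, 1))
current = [r for r in dim_customer if r["is_current"]]
print(len(dim_customer), current[0]["city"])  # 2 Mumbai
```

History is preserved: facts dated before the change still join to the Pune row via the validity dates, while new facts join to the Mumbai row.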
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
mumbai, pune, chennai
Work from Office
Senior-Level Data Engineer. Bachelor's with 6+ years or Master's with 5+ years of experience in Computer Science, Engineering, Math, or another quantitative field. 5+ years' experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance. 5+ years' experience writing and optimizing SQL queries and stored procedures for data processing and data analysis. 5+ years' experience designing and building complete data pipelines, moving and transforming data for ODS, staging, data warehousing, and data marts using SQL Server Integration Services (ETL) or other related technologies. 5+ years' experience implementing data warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies. 5+ years' experience with large-scale data processing and query optimization techniques using T-SQL. 5+ years' experience implementing audit, balance, and control mechanisms in data solutions. 3+ years' experience with source control repositories such as Git, TFVC, or Azure DevOps, including branching and merging, and implementing CI/CD pipelines for database and ETL workloads. 2+ years' experience working with Python pandas to process semi-structured data sets and load them to a SQL Server DB.
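The last requirement (loading semi-structured data into a relational database) reduces to flattening nested records into columns before the insert. A standard-library sketch, with `json`/`sqlite3` standing in for pandas/SQL Server; the payload and table are invented for illustration:

```python
import json
import sqlite3

# A hypothetical semi-structured payload with nested attributes.
payload = json.loads("""
[{"id": 1, "attrs": {"city": "Mumbai", "tier": "gold"}},
 {"id": 2, "attrs": {"city": "Pune",   "tier": "silver"}}]
""")

# Flatten the nested "attrs" object into plain columns.
flat = [(r["id"], r["attrs"]["city"], r["attrs"]["tier"]) for r in payload]

# Load into a relational table (SQLite standing in for SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, city TEXT, tier TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", flat)

count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(count)  # 2
```

With pandas the flatten step would typically be `pd.json_normalize` followed by `DataFrame.to_sql`, but the shape of the work is the same.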
Posted 2 weeks ago
6.0 - 9.0 years
8 - 11 Lacs
hyderabad
Work from Office
- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab
Posted 2 weeks ago