
23 Advanced SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At EY, you'll have the opportunity to shape a career that is as unique as you are, benefiting from global support, an inclusive culture, and cutting-edge technology to empower you to reach your full potential. Your distinctive voice and perspective are crucial in contributing to EY's continuous improvement. Join us in creating an exceptional experience for yourself and in fostering a better working world for all.

As part of the EY-ER-Regulatory Compliance team, you will play a key role in understanding clients' business requirements and delivering solutions in alignment with EY guidelines and methodologies. As a Regulatory Compliance Senior, you will actively cultivate and enhance both internal and external relationships. Upholding our commitment to quality, you will drive projects to successful completion with high-quality deliverables, enhance operational efficiency, identify and communicate risks to clients and EY senior management, and take the lead on internal initiatives.

We are seeking an ETQ Developer who will be responsible for designing, developing, and maintaining various modules of the ETQ Reliance platform. This role involves implementing system configurations and customizations, utilizing out-of-box features, writing ETQ scripts for complex configurations, and collaborating with cross-functional teams to gather requirements and ensure successful implementation of quality management systems in a regulated environment.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and define software functionalities.
- Design and develop software applications on the ETQ Reliance platform in an Agile team setting.
- Configure and customize the ETQ Reliance system to meet business needs.
- Conduct unit testing to ensure software quality and performance.
- Peer-review code and configurations.
- Create and maintain technical documentation, including system configurations and workflows.
- Perform code promotions following the defined SDLC process.
- Execute test scripts for code promotions.
- Provide technical support and training to end users.
- Troubleshoot and resolve issues in the production environment.
- Collaborate with technical leads and scrum masters to define project scope and deliverables.
- Stay updated on the latest ETQ Reliance features and industry trends.

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Proficiency in coding with Python and Java, advanced SQL, and DBMS.
- Strong knowledge of the ETQ Reliance platform and its modules.
- Excellent problem-solving and analytical skills.
- Previous experience as an ETQ Developer, including exposure to various configurations, customizations, system integrations, data migration, and automation in a regulated environment.
- Ability to work independently, manage multiple priorities, and follow Agile methodology.
- Strong communication and collaboration skills.

Good to Have:
- ETQ Reliance Promotion Certification.
- Intermediate or Advanced ETQ Designer certification.

EY is dedicated to building a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Our diverse teams, spread across 150 countries, leverage data and technology to provide assurance and support clients in growth, transformation, and operations across various sectors. With expertise in assurance, consulting, law, strategy, tax, and transactions, EY teams are committed to asking the right questions to address the complex challenges of today's world.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You should have good knowledge of Replication Server (SRS) architecture and a deep understanding of Sybase architecture, including installation, configuration, maintenance, and applying security patches. Your responsibilities will include DB shrink, SRS configuration, and troubleshooting latency issues. Performance tuning will be a key aspect of your role. Additionally, you should have experience with high-availability and disaster-recovery planning and solutions. Knowledge of scripting languages for automation is essential. Advanced SQL knowledge is required, including stored procedures, triggers, and complex query optimization techniques (a small trigger sketch follows below). You should also be familiar with backups, restores, and recovery models.

Skills required: DB shrink, complex query optimization, triggers, SQL, disaster recovery planning, installation, configuration, backups, restores, scripting languages, troubleshooting latency issues, performance tuning, Sybase SQL Anywhere, SRS configuration, high availability, security patches, Sybase architecture, stored procedures, recovery models, Replication Server architecture, maintenance, and advanced SQL.
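
Stored procedures and triggers come up repeatedly in this listing. Below is a minimal sketch of a trigger-driven audit trail, written against SQLite from Python purely for portability; Sybase ASE uses T-SQL trigger syntax, so the details would differ there, and all table names are invented.

```python
import sqlite3

# Minimal trigger-based audit trail, using SQLite as a stand-in for Sybase ASE.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE accounts_audit (
    account_id  INTEGER,
    old_balance REAL,
    new_balance REAL,
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP
);
-- Fire on every balance update and record the before/after values.
CREATE TRIGGER trg_accounts_audit
AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO accounts_audit (account_id, old_balance, new_balance)
    VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")
print(conn.execute("SELECT * FROM accounts_audit").fetchall())
```

The same pattern, a change-capture trigger feeding an audit table, is a common building block for the reconciliation and recovery work the listing describes.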

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Analytics Lead at Cummins Inc., you will facilitate data, compliance, and environment governance processes for the assigned domain. Your role includes leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles. You will coach team members, business teams, and stakeholders to find necessary and relevant data, contribute to communities of practice promoting responsible analytics use, and develop the capability of peers and team members within the analytics ecosystem. Additionally, you will mentor and review the work of less experienced team members, integrate data from various source systems to build models for business use, and cleanse data to ensure accuracy and reduce redundancy.

Your responsibilities will also involve leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps. You will manage version control and collaboration using GitLab, utilize SharePoint for project management and data collaboration, and provide regular updates on work progress to stakeholders via JIRA/Meets.

Qualifications:
- College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience.
- This position may require licensing for compliance with export controls or sanctions regulations.

Competencies: balancing stakeholders, collaborating effectively, communicating clearly and effectively, customer focus, managing ambiguity, organizational savvy, data analytics, data mining, data modeling, data communication and visualization, data literacy, data profiling, data quality, project management, and valuing differences.

Technical Skills: advanced Python; Databricks and PySpark; advanced SQL and ETL tools; Power Automate; Power Apps; SharePoint; GitLab; Power BI; Jira; Mendix; statistics.

Soft Skills: strong problem-solving and analytical abilities, excellent communication and stakeholder management skills, proven ability to lead a team, strategic thinking, and advanced project management.

Experience:
- Intermediate level of relevant work experience required.
- This is a hybrid role.

Join Cummins Inc. and be part of a dynamic team where you can utilize your technical and soft skills to make a significant impact in the field of data analytics.

Posted 2 days ago

Apply

6.0 - 10.0 years

0 - 3 Lacs

Pune, Chennai

Hybrid

Greetings! We have an opening with one of our clients, Mphasis, for the position of Scala Developer.

Role & responsibilities:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Minimum 5 years of professional experience in data engineering, with a strong focus on big data technologies.
- Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark (see the sketch below).
- Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals.
- Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics.
- Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive).
- Experience with building and optimizing ETL/ELT processes and data warehousing concepts.
- Strong understanding of data modeling techniques (e.g., star schema, snowflake schema).
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an Agile team environment.

Work mode: Hybrid (3 days)
Location: Pune / Chennai
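
For a feel of the Spark DataFrame work the role describes, the sketch below computes a per-customer running total, the DataFrame analogue of a SQL window function. The role asks for Scala, but the sketch uses PySpark to keep all examples on this page in one language; the dataset and column names are made up.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("orders-demo").getOrCreate()

# Hypothetical orders dataset; in the role above this would come
# from HDFS/Hive rather than an inline list.
orders = spark.createDataFrame(
    [("c1", "2024-01-01", 120.0), ("c1", "2024-01-05", 80.0),
     ("c2", "2024-01-02", 200.0)],
    ["customer_id", "order_date", "amount"],
)

# Running total per customer, ordered by order date.
w = Window.partitionBy("customer_id").orderBy("order_date")
orders.withColumn("running_total", F.sum("amount").over(w)).show()
```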

Posted 3 days ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Nagpur, Pune

Work from Office

Design and manage scalable, secure, and high-performance database systems aligned with business goals. Optimize performance, ensure data integrity, and implement modern data solutions. Lead cross-functional collaboration.

Posted 4 days ago

Apply

2.0 - 4.0 years

12 - 15 Lacs

Bengaluru

Hybrid

Role & responsibilities:
- Manage end-to-end data pipelines, ensuring seamless flow and integrity of data from diverse sources to analytical systems.
- Collaborate with data scientists, analysts, and business teams to understand data needs and develop efficient solutions.
- Implement robust data governance practices to maintain data quality standards and facilitate reliable analysis and reporting.
- Conduct thorough data validation procedures to ensure accuracy and reliability of analytical outputs.
- Monitor data systems and pipelines, troubleshoot issues, and ensure the continuous availability of data.
- Ensure data quality, integrity, and consistency across different data sources and storage systems.
- Optimize data flow and storage processes for performance and scalability.

Must have:
- At least 2-4 years of experience in analytics, reporting out metrics, and deep-dive analysis.
- Strong proficiency with advanced SQL (window functions, DML and DDL commands, CTEs, subqueries, etc.); see the sketch below.
- Expertise in building end-to-end data pipelines and ETL frameworks and tools.
- Ability to write complex queries and understanding of database concepts.
- Strong understanding of data modelling, schema design, and database optimization techniques.
- Knowledge of version control (e.g., Git) and collaborative development practices.
- Exceptional communication and collaboration skills.

Nice to have:
- Exposure to the broader analytics ecosystem.
- Experience with data lake architectures and big data technologies.

Education:
- Bachelor's degree in computer science, engineering, or a related field.
- At least 2-4 years of relevant experience in the analytics organizations of large corporates or in analytics roles at consulting companies.
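
As a concrete instance of the "advanced SQL" asked for above, here is a CTE-plus-ROW_NUMBER() deduplication, a standard step in data pipelines, run against an in-memory SQLite database (SQLite supports window functions from version 3.25 onward). Table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id TEXT, event_ts TEXT, payload TEXT);
INSERT INTO events VALUES
  ('u1', '2024-01-01 10:00', 'a'),
  ('u1', '2024-01-01 10:00', 'a'),   -- duplicate row from a retried load
  ('u2', '2024-01-02 09:30', 'b');
""")
# CTE + ROW_NUMBER(): keep only the first copy of each (user_id, event_ts) pair.
rows = conn.execute("""
WITH ranked AS (
    SELECT user_id, event_ts, payload,
           ROW_NUMBER() OVER (
               PARTITION BY user_id, event_ts ORDER BY event_ts
           ) AS rn
    FROM events
)
SELECT user_id, event_ts, payload FROM ranked WHERE rn = 1;
""").fetchall()
print(rows)
```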

Posted 4 days ago

Apply

3.0 - 5.0 years

12 - 15 Lacs

Bengaluru

Hybrid

Role & responsibilities:
- Mine and analyse data to identify patterns and correlations among the various data points.
- Perform end-to-end analysis across all digital touchpoints, including data gathering from large and complex data sets, data processing, and analysis.
- Conduct in-depth analysis of user behaviour, customer journeys, and other relevant metrics to understand the effectiveness of digital initiatives and identify areas for improvement (see the window-function sketch below).
- Present findings from analytics and research and make recommendations to business teams.

Must have:
- 3-5 years of experience in analytics, reporting out metrics, and deep-dive analysis.
- Strong proficiency with advanced SQL (window functions, DML and DDL commands, CTEs, subqueries, etc.) for data analysis and building end-to-end data pipelines.
- Ability to write complex queries and understanding of database concepts.
- Strong analytical problem-solving skills and an aptitude for learning quickly.
- Expert data analysis and presentation skills.
- Exceptional communication and collaboration skills.
- Critical thinking and the ability to think beyond the obvious.

Nice to have:
- Experience in web analytics and tools such as Adobe Omniture and Google Analytics.
- Experience with programming languages like Python and Unix shell for data pipeline automation and analysis.
- Knowledge of statistics concepts and machine learning algorithms such as regression and clustering.

Education:
- Bachelor's degree with post-graduation in management science or related fields.
- 2-4 years of relevant experience in the analytics organizations of large corporates or in analytics roles at consulting companies.
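
One hedged example of the customer-journey analysis mentioned above: the LAG() window function turns a raw pageview log into page-to-page transitions. SQLite is used here as a convenient stand-in for whatever warehouse the team actually runs, and the data is invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pageviews (user_id TEXT, ts TEXT, page TEXT);
INSERT INTO pageviews VALUES
  ('u1', '10:00', 'home'), ('u1', '10:01', 'search'), ('u1', '10:04', 'product'),
  ('u2', '11:00', 'home'), ('u2', '11:02', 'product');
""")
# LAG() reconstructs each user's journey as (previous page -> current page)
# transitions, a typical first step in journey analysis.
for row in conn.execute("""
SELECT user_id,
       LAG(page) OVER (PARTITION BY user_id ORDER BY ts) AS prev_page,
       page AS curr_page
FROM pageviews
ORDER BY user_id, ts;
"""):
    print(row)
```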

Posted 4 days ago

Apply

6.0 - 8.0 years

8 - 18 Lacs

Pune

Hybrid

Job Title: Lead ETL Developer
Job Location: Pune

Company Introduction: Join Nitor Infotech, an Ascendion company, where we harness data to drive impactful solutions. Our innovative team is dedicated to excellence in data processing and analytics, making a significant difference in the retail domain. Be part of a collaborative environment that values your expertise and contributions.

Job Overview: We are seeking an ETL Developer with expertise in advanced SQL, Python, and shell scripting. This full-time position reports to the Data Engineering Manager and is available in a hybrid work model. This is a replacement position within the SRAI - EYC Implementation team.

Key Responsibilities:
- Design and develop ETL processes for data extraction, transformation, and loading (see the sketch below).
- Utilize advanced SQL for data processing and analysis.
- Implement data processing solutions using Python and shell scripting.
- Collaborate with cross-functional teams to understand data requirements.
- Maintain and optimize data pipelines for performance and reliability.
- Provide insights and analysis to support business decisions.
- Ensure data quality and integrity throughout the ETL process.
- Stay updated on industry trends and best practices in data engineering.

Must-Have Skills and Qualifications:
- 7-8 years of experience as an ETL Developer.
- Expertise in advanced SQL for data manipulation and analysis.
- Proficiency in Python and shell scripting.
- Foundational understanding of Databricks and Power BI.
- Strong logical problem-solving skills.
- Experience in data processing and transformation.
- Understanding of the retail domain is a plus.

Good-to-Have Skills and Qualifications:
- Familiarity with cloud data platforms (AWS, Azure).
- Knowledge of data warehousing concepts.
- Experience with data visualization tools.
- Understanding of Agile methodologies.

What We Offer: Competitive salary and a comprehensive benefits package, opportunities for professional growth and advancement, a collaborative and innovative work environment, flexible work arrangements, and impactful work that drives industry change.

DEI Statement: At Nitor Infotech, we embrace diversity and inclusion. We actively foster an environment where all voices are heard and valued.

ISMS Statement: Nitor Infotech maintains ISO 27001 certification. All employees must adhere to our information security policies.
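
A toy extract-transform-load pipeline in Python, echoing the responsibilities above. Everything here, the inline CSV, the schema, and the filter rule, is illustrative; a real pipeline would read from files or databases and add logging and error handling.

```python
import csv
import io
import sqlite3

# Illustrative raw input; a record with a missing amount is dropped in transform().
RAW = "order_id,amount,country\n1,100,IN\n2,,IN\n3,250,US\n"

def extract(text: str):
    """Parse CSV text into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop records with missing amounts and normalise types."""
    return [(int(r["order_id"]), float(r["amount"]), r["country"])
            for r in rows if r["amount"]]

def load(rows, conn):
    """Write the cleaned records to the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 350.0)
```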

Posted 4 days ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Databricks Engineer-Lead, you will be responsible for designing and developing ETL pipelines using Azure Data Factory for data ingestion and transformation. You will collaborate with various Azure stack modules, such as Data Lakes and SQL Data Warehouse, to create robust data solutions. Your role will involve writing efficient SQL, Python, and PySpark code for data processing and transformation (a PySpark sketch follows below). It is essential to understand and translate business requirements into technical designs, develop mapping documents, and adhere to transformation rules as per the project scope. Effective communication with stakeholders to ensure smooth project execution is a crucial aspect of this role.

To excel in this position, you should possess 7-10 years of experience in data ingestion, data processing, and analytical pipelines involving big data and relational databases. Hands-on experience with Azure services such as Azure Data Lake Storage, Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Azure SQL Database is required. Proficiency in SQL, Python, and PySpark for data manipulation is essential. Familiarity with DevOps practices and CI/CD deployments is a plus. Strong communication skills and attention to detail, especially in high-pressure situations, are highly valued in this role. Previous experience in the insurance or financial industry is preferred.

This role is based in Hyderabad and requires the selected candidate to work from the office. If you are passionate about leveraging Databricks, PySpark, SQL, and other Azure technologies to drive innovative data solutions, this position offers an exciting opportunity to lead and contribute to impactful projects.
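
A rough PySpark shape for the ingestion-and-transformation work described: read raw Parquet from a data lake path, clean it, and write a curated copy. The ADLS paths and column names are placeholders, not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest").getOrCreate()

# Illustrative ADLS Gen2 paths -- placeholders, not real endpoints.
SRC = "abfss://raw@<storage-account>.dfs.core.windows.net/sales/"
DST = "abfss://curated@<storage-account>.dfs.core.windows.net/sales_clean/"

# Read raw data, drop duplicate orders, normalise the date, filter bad rows.
df = (spark.read.format("parquet").load(SRC)
      .dropDuplicates(["order_id"])
      .withColumn("order_date", F.to_date("order_date"))
      .filter(F.col("amount") > 0))

# Write the curated dataset, partitioned for downstream query performance.
df.write.mode("overwrite").partitionBy("order_date").parquet(DST)
```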

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

This role is part of the upcoming ValueMomentum Data Engineering Recruitment Drive on July 19th. As a Tech Lead-Modern Data, you will be responsible for leading data integration processes utilizing Informatica IICS. With 7-10 years of experience, you will be involved in designing, developing, and managing ETL/ELT processes. Your role will entail close collaboration with cross-functional teams to ensure that data solutions not only meet business needs but also align with industry best practices.

Joining ValueMomentum's Engineering Center means becoming part of a team of passionate engineers dedicated to addressing complex business challenges with innovative solutions. Our focus on transforming the P&C insurance value chain relies on a strong engineering foundation and continuous refinement of processes, methodologies, tools, agile delivery teams, and core engineering archetypes. With expertise in cloud engineering, application engineering, data engineering, core engineering, quality engineering, and domain expertise, we are committed to investing in your growth through our Infinity Program, empowering you to build your career with role-specific skill development using immersive learning platforms.

As a Tech Lead, your responsibilities will include designing and implementing data integration processes using Informatica IICS; constructing mappings, tasks, task flows, schedules, and parameter files; ensuring adherence to ETL/ELT best practices; creating ETL mapping documentation; and collaborating with stakeholders to understand data requirements and implement solutions. Supporting activities such as ticket creation and resolution in Jira/ServiceNow, working in an Agile/DevOps environment, and ensuring timely delivery of solutions are key aspects of this role.

To be successful in this position, you should have at least 7 years of experience in Informatica, with a minimum of 2 years in Informatica IICS; strong experience in ETL tools and database design; a good understanding of Agile methodologies; experience working in onsite/offshore models; and, preferably, experience in the insurance or financial industry. Strong problem-solving and analytical skills, attention to detail in high-pressure situations, and excellent verbal and written communication skills are essential.

ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. With a focus on helping insurers achieve sustained growth and high performance, the company enhances stakeholder value and fosters resilient societies. Having served over 100 insurers, ValueMomentum stands as one of the largest services providers exclusively dedicated to the insurance industry.

At ValueMomentum, we offer a congenial environment for your professional growth, surrounded by experienced professionals. Benefits include a competitive compensation package; individual career development through coaching and mentoring programs; comprehensive training and certification programs; performance management tools such as goal setting, continuous feedback, and year-end appraisal; and rewards and recognition for outstanding performers.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

The Associate Manager, Compliance Analytics supports the development and implementation of compliance analytics across the global compliance organization. You will be responsible for designing and developing analytical solutions, generating insights, and fostering a culture of ethics and integrity. Your role will involve transforming data from multiple systems, designing the analytics approach, and executing the analytics.

Your responsibilities will include designing, developing, and maintaining analytics that provide business insights, enable risk-intelligent decisions, inform compliance risk assessment processes, and support remediation activities. You will be involved in scoping, gathering requirements, developing analytics models, validating and testing models, and communicating results to stakeholders for analytics projects. Additionally, you will support the execution of analytics initiatives to enhance the global compliance program and provide actionable insights through data and analysis to achieve business objectives.

Furthermore, you will contribute to the development of standardized analytics processes and frameworks, including documentation and validation of work. You will also be responsible for setting up sustainable analytics solutions, including data pipelines and automated refresh schedules as necessary. Collaboration with compliance officers, IT, and stakeholders to understand business objectives and provide reliable and accurate reports, insights, and analysis to inform decision-making is an essential aspect of this role.

As part of your duties, you will lead a culture of continuous improvement by enhancing existing databases, data collection methods, statistical methods, technology, procedures, and training. You will partner with data custodians and process experts to ensure the quality of data and definitions that support the building of reliable data models and analysis. Coaching and developing junior team members will also be a key component of this role.

To qualify for this position, you should have 8+ years of relevant work experience in Python, advanced SQL, R, Azure, Databricks, PySpark, and Power BI, along with strong knowledge of and experience with advanced analytics tools and languages for analyzing large data sets from multiple sources. A BTech in Computer Science or IT, an MSc in Mathematics/Statistics, or an equivalent degree from an accredited university is necessary. You should possess a strong understanding of algorithms, mathematical models, statistical techniques, and data mining, and have experience implementing statistical and machine learning models (a simple screening sketch follows below). Experience analyzing accounting and other financial data, together with a demonstrated ability to exercise discretion and maintain confidentiality, is essential. Knowledge of and experience with data transformation and cleansing, data modeling, and database concepts are also preferred.
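
As a tiny instance of the statistical screening this role alludes to, the sketch below flags transactions more than two standard deviations from the mean, a z-score outlier check. The amounts and the threshold are illustrative only; real compliance analytics would use far richer models.

```python
import statistics

# Toy z-score screen for outlier transactions; data and threshold are invented.
amounts = [120.0, 95.0, 110.0, 105.0, 4_800.0, 98.0, 102.0]
mean = statistics.mean(amounts)
stdev = statistics.stdev(amounts)

# Flag any amount more than 2 standard deviations from the mean.
flagged = [a for a in amounts if abs(a - mean) / stdev > 2]
print(f"mean={mean:.1f} stdev={stdev:.1f} flagged={flagged}")
```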

Posted 3 weeks ago

Apply

8.0 - 12.0 years

18 - 27 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Job Description:
- Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse.
- Develop and optimize data models and data storage solutions on Azure.
- Collaborate with data scientists and analysts to implement data processing and data transformation tasks.
- Ensure data quality and integrity through data validation and cleansing methodologies.
- Monitor and troubleshoot data pipelines to identify and resolve performance issues.
- Collaborate with cross-functional teams to understand and prioritize data requirements.
- Stay up to date with the latest trends and technologies in data engineering and Azure services.

Skills & Qualifications:
- Bachelor's degree in IT, computer science, computer engineering, or similar.
- 8+ years of experience in data engineering.
- Microsoft Azure Synapse Analytics experience is essential (Azure Data Factory, Dedicated SQL Pool, Lake Database, Azure Storage).
- Hands-on experience with Spark notebooks (Python or Scala) is mandatory.
- End-to-end data warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security.
- Advanced SQL/relational database knowledge and query authoring.
- Demonstrated experience in designing and delivering data platforms for business intelligence and data warehousing.
- Strong skills in handling and analysing complex, high-volume data with excellent attention to detail.
- Knowledge of data modelling and data warehousing concepts, such as Data Vault or 3NF.
- Experience with data governance (quality, lineage, data dictionary, and security).
- Familiarity with Agile methodology and an Agile working environment.
- Ability to work independently with POs, BAs, and architects.

Posted 1 month ago

Apply

2.0 - 7.0 years

8 - 18 Lacs

Gurugram

Work from Office

About this role: This is a career opportunity for a Data Analyst within the Chief Data and Analytics group. The role supports business processes (including but not limited to enterprise, account, and person data management), with a focus on building data management and analytics solutions to improve data quality at scale.

What you'll do:
- Analyse data to define and create data quality metrics that support strategic decision making (a small metrics sketch follows below).
- Drive data quality initiatives to improve the accuracy of client data that is leveraged in critical business decisions throughout the enterprise, while curating data insights that improve the transparency and value of data across Gartner.
- Complete analysis, interpreting results using a variety of techniques ranging from simple data aggregation to more complex statistical analysis.
- Create executive-level dashboards to present data quality metrics.
- Collaborate across business and technical teams, both on site and offshore, to create business deliverables such as data flow diagrams, business requirements, and functional requirements.
- Develop an understanding of relevant business areas, technical options, limitations, costs, and risks in order to communicate trade-offs and recommend solutions or suggest alternatives that shape requirements.
- Independently drive critical data workshops with business and IT stakeholders to develop requirements and execution processes while managing dependencies across teams.
- Collaborate with and support the team's data scientists with the right data insights to help them build AI models for better data quality.

What you'll need:
- 2-4 years of experience as a Data Analyst; prior experience working in data warehousing, business intelligence, master data management, or analytics environments is a plus.
- Advanced SQL skills (mandatory).
- Strong Python skills (required).
- Dashboard-building experience (Power BI, Tableau) to present data insights.
- Strong analytical, strategic-thinking, and problem-solving skills, including the ability to clearly and concisely gather, interpret, analyze, and document business process, user, and data functional requirements in a structured way.
- Understanding of data analysis tools and techniques.
- Ability to break down complex business problems and workflows into meaningful components understandable at various levels.
- Well versed in tools such as MS Excel, MS Word, JIRA, and Confluence.
- Knowledge of and experience with Scrum/Agile methodology.
- Expert-level oral and written communication with both technical and non-technical personnel.

Who you are:
- Effective time management skills and the ability to meet deadlines.
- Excellent communication skills when interacting with technical and business audiences.
- Excellent organization, multitasking, and prioritization skills.
- Willingness and aptitude to embrace new technologies and ideas and to master concepts rapidly.
- Intellectual curiosity, passion for technology, and keeping up with new trends.
- Delivering project work on time and within budget, with high quality.
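
A minimal sketch of the data-quality metrics the role centers on, computing completeness and validity percentages for a couple of fields. The record layout and the email-validity rule are hypothetical stand-ins.

```python
# Simple data-quality metrics of the kind behind DQ dashboards; fields are invented.
records = [
    {"account_id": "A1", "email": "x@corp.com", "country": "IN"},
    {"account_id": "A2", "email": None,         "country": "IN"},
    {"account_id": "A3", "email": "bad-email",  "country": None},
]

def completeness(rows, field):
    """Share of records where the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, predicate):
    """Share of populated values that pass a validity rule."""
    present = [r[field] for r in rows if r[field] is not None]
    return sum(predicate(v) for v in present) / len(present) if present else 1.0

print(f"email completeness:   {completeness(records, 'email'):.0%}")
print(f"email validity:       {validity(records, 'email', lambda v: '@' in v):.0%}")
print(f"country completeness: {completeness(records, 'country'):.0%}")
```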

Posted 1 month ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About the Team: As Business Analysts, it's on us to dive into data and derive insights from it. These then become actionable solutions in the form of changes, improvements, upgrades, and new features. As a Business Analyst at Meesho, you will play a crucial role in identifying, improving, and developing technology solutions that drive our strategic goals. This is a tremendous opportunity to learn about high-priority initiatives and collaborate with colleagues throughout the firm and across teams. We work at the intersection of business and technology, continuously developing our leadership, management, and communication skills in the process. The exact team you will be working with will be decided during or after the hiring process. Regardless, you are sure to learn and grow and have fun doing so too. Each of our teams at Meesho has its own fun rituals, from casual catch-ups to bar hopping, movie nights, and games. So, join us!

About the Role: As a Senior Business Analyst, you will work on improving the reporting tools, methods, and processes of the team you are assigned to. You will also create and deliver weekly, monthly, and quarterly metrics critical for tracking and managing the business. You will manage numerous requests concurrently and strategically, prioritising them when necessary. You will actively engage with internal partners throughout the organisation to meet and exceed customer service levels and transport-related KPIs. You will brainstorm simple, scalable solutions to difficult problems and seamlessly manage projects under your purview. You will maintain excellent relationships with our users and, in fact, advocate for them while keeping in mind the business goals of your team.

What you will do:
- Create various algorithms for optimizing demand and supply data.
- Conduct analysis and solution-building based on insights captured from data.
- Give insights to management and help in strategic planning.
- Analyze metrics, key indicators, and other available data sources to discover root causes of process defects.
- Support business development and help create efficient designs and solution processes.
- Determine efficient utilization of resources.
- Research and implement cost reduction opportunities.

Must-have skills:
- MBA in any discipline.
- 2+ years of experience as a Business Analyst.
- Proficiency in advanced Excel, advanced SQL, and Python (all must-haves).
- Understanding of basic statistics and probability concepts.
- Proven problem-solving skills.

Posted 1 month ago

Apply

3.0 - 8.0 years

16 - 30 Lacs

Bengaluru

Work from Office

Role & responsibilities:
- Design, develop, and optimize complex SQL queries, stored procedures, and data models for Oracle-based systems.
- Create and maintain efficient data pipelines for extract, transform, and load (ETL) processes using Informatica or Python.
- Implement data quality controls and validation processes to ensure data integrity.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Document database designs, procedures, and configurations to support knowledge sharing and system maintenance.
- Troubleshoot and resolve database performance issues through query optimization and indexing strategies (see the sketch below).

Preferred candidate profile:
- 3+ years of experience with Oracle databases, including advanced SQL and PL/SQL development.
- Strong knowledge of data modelling principles and database design.
- Proficiency with Python for data processing and automation.
- Experience implementing and maintaining data quality controls.
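
To illustrate the indexing-strategy bullet, the sketch below uses SQLite's EXPLAIN QUERY PLAN (a portable stand-in for Oracle's EXPLAIN PLAN) to show the same query moving from a full table scan to an index search once a suitable index exists. Table and index names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(10_000)])

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index on customer_id: the plan reports a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Add a covering filter index and re-check: the plan switches to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```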

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Navi Mumbai, Pune, Mumbai (All Areas)

Hybrid

Job Overview: We are seeking a highly skilled Data Engineer with expertise in SQL, Python, data warehousing, AWS, Airflow, ETL, and data modeling. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines, ensuring efficient data processing and integration across various platforms. This role requires strong problem-solving skills, an analytical mindset, and a deep understanding of modern data engineering frameworks.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL processes to support business intelligence, analytics, and operational data needs.
- Build and maintain data models (conceptual, logical, and physical) to enhance data storage, retrieval, and transformation efficiency.
- Develop, test, and optimize complex SQL queries for efficient data extraction, transformation, and loading (ETL).
- Implement and manage data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) for structured and unstructured data storage.
- Work with AWS, Azure, and cloud-based data solutions to build high-performance data ecosystems.
- Utilize Apache Airflow for orchestrating workflows and automating data pipeline execution (a DAG skeleton follows below).
- Collaborate with cross-functional teams to understand business data requirements and ensure alignment with data strategies.
- Ensure data integrity, security, and compliance with governance policies and best practices.
- Monitor, troubleshoot, and improve the performance of existing data systems for scalability and reliability.
- Stay updated with emerging data engineering technologies, frameworks, and best practices to drive continuous improvement.

Required Skills & Qualifications:
- Proficiency in SQL for query development, performance tuning, and optimization.
- Strong Python programming skills for data processing, automation, and scripting.
- Hands-on experience with ETL development, data integration, and transformation workflows.
- Expertise in data modeling for efficient database and data warehouse design.
- Experience with cloud platforms such as AWS (S3, Redshift, Lambda), Azure, or GCP.
- Working knowledge of Airflow or similar workflow orchestration tools.
- Familiarity with big data frameworks like Hadoop or Spark (preferred but not mandatory).
- Strong problem-solving skills and ability to work in a fast-paced, dynamic environment.
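
A skeleton of the Airflow orchestration mentioned above, in recent Airflow 2.x style (the schedule argument name varies across versions): three Python tasks chained into a daily DAG. The DAG id, schedule, and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real tasks would call out to extract/transform/load logic.
def extract(): ...
def transform(): ...
def load(): ...

with DAG(
    dag_id="daily_sales_etl",           # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```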

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
- Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing.
- Ingesting vast amounts of identity and event data from our customers and partners.
- Facilitating data transfers across systems.
- Ensuring the integrity and health of our datasets.
- And much more.

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure (a streaming sketch follows below).
- Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems.
- Working with data engineers to optimize data models and workflows.
- Working with data analysts to develop ETL processes for analysis and reporting.
- Working with product managers to design and build data products.
- Working with our DevOps team to scale and optimize our data infrastructure.
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects.
- Participating in the on-call rotation in their respective time zones (being available by phone or email in case something goes wrong).

Desired Characteristics:
- Minimum 5-10 years of software engineering experience.
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale, plus eagerness to learn new things.
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments.
- Exposure to the whole software development lifecycle, from inception to production and monitoring.
- Fluency in Python, or solid experience in Scala or Java.
- Proficiency with relational databases and advanced SQL.
- Expert usage of services like Spark and Hive.
- Experience with web frameworks such as Flask and Django.
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience with Kafka or other stream message processing solutions.
- Experience using cloud services (AWS) at scale.
- Experience in agile software development processes.
- Excellent interpersonal and communication skills.

Nice to have:
- Experience with large-scale, multi-tenant distributed systems.
- Experience with columnar/NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase.
- Experience with real-time streaming frameworks: Flink, Storm.
- Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
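
A hedged sketch of the real-time side of this stack: Spark Structured Streaming reading a Kafka topic and counting events per minute. The broker address and topic name are placeholders, and the job assumes the spark-sql-kafka connector package is available on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read the raw event stream from Kafka; broker and topic are placeholders.
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "identity-events")
          .load())

# Count events per 1-minute window using the Kafka message timestamp column.
counts = (events
          .groupBy(F.window(F.col("timestamp"), "1 minute"))
          .count())

# Stream the running counts to the console for inspection.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```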

Posted 1 month ago

Apply

3.0 - 8.0 years

8 - 14 Lacs

Nashik

Work from Office

Architect and develop apps (.NET Core, Angular, SQL Server). Design and optimize database schemas. Ensure coding/testing best practices. Conduct code reviews. Collaborate with cross-functional teams. Stay updated with new tech. Troubleshoot complex issues.

Posted 2 months ago

Apply

3.0 - 8.0 years

12 - 14 Lacs

Mumbai

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 2 months ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Pune

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 2 months ago

Apply

3.0 - 5.0 years

14 - 18 Lacs

Pune

Work from Office

Required Skills:
- Proficient in T-SQL for complex database querying and optimization.
- Expertise in Power BI Desktop and Service for report/dashboard development.
- Hands-on experience with SQL Server database design and management.
- Strong data modeling skills, including dimensional modeling and star schema (see the sketch below).
- Ability to transform raw data into meaningful, actionable information.

Preferred Skills ("Good to Have"):
- Experience with Azure Data Services (e.g., Azure SQL, Azure Synapse Analytics, Azure Data Factory).
- Knowledge of data warehousing concepts and best practices.
- Familiarity with ETL processes and data integration tools.
- Understanding of Power BI governance, security, and deployment strategies.
- Exposure to agile software development methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication and stakeholder management abilities.

Key Responsibilities:
- Design and develop interactive, visually appealing Power BI dashboards and reports.
- Implement complex data models and DAX calculations to meet business requirements.
- Optimize SQL queries for high performance and scalability.
- Automate data refresh processes and implement data security protocols.
- Collaborate with business stakeholders to understand reporting needs.
- Provide technical guidance and training to end users.
- Continuously improve dashboard design, functionality, and user experience.
- Stay up to date with the latest Power BI and MS SQL Server features and best practices.
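
Since the listing calls out dimensional modeling and star schemas, here is a minimal star schema, one fact table joined to two dimensions, built in SQLite from Python. All names and figures are illustrative.

```python
import sqlite3

# A tiny star schema: fact_sales keyed to a date dimension and a product dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date (date_key),
    product_key INTEGER REFERENCES dim_product (product_key),
    units       INTEGER,
    revenue     REAL
);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01', '2024-01');
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")
# Typical BI query: aggregate the fact table across dimension attributes.
print(conn.execute("""
SELECT d.month, p.category, SUM(f.revenue)
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.month, p.category;
""").fetchall())
```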

Posted 2 months ago

Apply

2.0 - 7.0 years

7 - 17 Lacs

Bengaluru

Work from Office

About this role: Wells Fargo is seeking an Analytics Consultant.

In this role, you will:
- Consult with business line and enterprise functions on less complex research.
- Use functional knowledge to assist in non-model quantitative tools that support strategic decision making.
- Perform analysis of findings and trends using statistical analysis, and document the process.
- Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance.
- Identify and define business requirements, and translate data and business needs into research and recommendations to improve efficiency.
- Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing.
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff.
- Understand compliance and risk management requirements for the supported area.
- Ensure adherence to data management and data governance regulations and policies.
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals.
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups.

Required Qualifications:
- 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Bachelor's degree or higher in a quantitative field such as computer science, engineering, applied math, accounting, finance, economics, or econometrics with a quantitative emphasis.
- Experience in one or a combination of the following: data quality, reporting, analytics.
- Prior experience in an internal/external consultative role and/or investment banking/corporate banking support experience is preferred.
- Prior experience individually handling small or medium complexity projects, participation in complex or large-scale projects, and experience with matrix-managed teams is a plus.
- Solid mastery of advanced SQL, advanced Excel, and Power BI; knowledge of performing ETL tasks using SQL/Python; Microsoft PowerPoint skills.
- A very strong interest in developing both business and technical skills while navigating a dynamic business environment.
- Excellent verbal, written, and interpersonal communication skills.
- Strong analytical skills with high attention to detail and accuracy.
- Experience with and knowledge of relational databases and concepts.
- Ability to interact and build relationships with senior leaders, peers, and key support partners.
- Ability to work creatively, analytically, and independently in a dynamic environment.
- Exceptional oral and written communication skills, with the ability to communicate effectively with both business unit (non-technical) and development (technical) personnel.
- Experience documenting processes and reporting workflows.

Job Expectations:
- Strong individual contributor with excellent communication skills for a variety of audiences (other technical staff and senior management), both verbally and in writing.
- Experience in problem analysis, solution implementation, and change management.
- Must make timely and independent judgment decisions while working in a fast-paced, results-driven environment.
- Ability to articulate issues, risks, and proposed solutions to various levels of staff and management.
- Capability to multi-task and finish work within strict timelines, and to provide timely requests for information and follow-up questions.
- Skill in managing relationships with key stakeholders.
- Eagerness to contribute collaboratively on projects and discussions.
- Perpetual interest in learning something new, while being comfortable with not knowing all the answers.
- Attention to detail in both analytics and documentation.
- Connect with customers to understand and document business process workflows.
- Develop an understanding of business processes and recommend efficiencies based on technical and architectural knowledge.
- Provide technical consulting to business leaders at an appropriate level of information encapsulation.
- Apply critical thinking skills and perform advanced analytics with the goal of solving complex and multi-faceted business problems.
- Develop reports using Power BI, SQL, Tableau, Python, Excel, or various other tools; SSRS knowledge is an added advantage.
- Verify the accuracy of reports, reconciling data and producing data/information within established timelines.
- Generate deep insights through the analysis of data and understanding of operational processes, and turn them into actionable recommendations.
- Work with technology partners to manage tables, views, and other database structures, adapting to changing business needs in compliance with embedded IT policies.
- Answer ad hoc questions from customers, including confirming data, populating templates provided to the group, or creating new reports/extracts as requested.
- Collaborate with team members to operate at a higher level of dimensionality, to innovate and bring game-changing ideas to fruition.
- Partner with business stakeholders in the development of key business reporting (e.g., portfolio-level dashboards, productivity, regulatory reporting) and optimal delivery channels for the distribution of reporting solutions.

Posted 2 months ago

Apply

3 - 8 years

7 - 11 Lacs

Chennai

Work from Office

We are looking for a skilled Intermediate Quest One Identity Manager Developer with 3 to 8 years of experience to join our team. The ideal candidate will have expertise in custom connector development, advanced workflow configurations, and optimizing synchronization processes for large-scale identity management.

### Roles and Responsibilities
- Design and implement custom workflows using Designer and Object Browser for complex provisioning tasks.
- Develop and maintain custom connectors for integrating with external systems using Synchronization Editor and APIs.
- Write advanced SQL stored procedures, triggers, and custom queries for data reconciliation and manipulation within One Identity's database.
- Configure and optimize Job Service and DBQueue to handle high-volume job processing and resolve performance bottlenecks.
- Develop complex VBScript and PowerShell scripts to implement business logic.
- Implement and configure role mining and role lifecycle management processes, ensuring role compliance and SoD policy enforcement.
- Extend the functionality of the Web Portal by customizing UI forms, adding new fields, and configuring specific approval workflows for access requests.
- Perform advanced troubleshooting using Job Queue Info, analyzing detailed logs, and debugging synchronization and provisioning failures.
- Implement and maintain the attestation process, ensuring compliance through periodic certification of user roles and entitlements.
- Lead efforts to implement custom reporting using SSRS or the One Identity Reporting Module to deliver access governance insights.
- Integrate One Identity Manager with cloud services (e.g., Azure AD, AWS IAM) and on-prem applications using custom-developed connectors.

### Job Requirements
- In-depth knowledge of Quest One Identity Manager architecture, including Application Server, Job Server, and Data Governance Edition.
- Advanced SQL skills for writing stored procedures, views, and triggers.
- Proficiency in VBScript and PowerShell, and knowledge of the One Identity Manager API.
- Strong experience with Synchronization Editor for developing custom connectors.
- Deep understanding of Active Directory, LDAP, HR systems, Azure, and other integrated systems.
- Familiarity with SoD policies, role mining, and advanced RBAC configuration.

Posted 2 months ago

Apply