
187 Advanced SQL Jobs

JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Data Analyst with expertise in Advanced SQL and Excel, you will collaborate with stakeholders to understand business requirements, analyze data, and deliver actionable insights through data-driven reports and visualizations. You will work closely with a dynamic team in Gurgaon (DLF), operating US shifts (night shifts), five days a week, work-from-office. A background in the BPO or call-center operations industry is essential. To excel in this position, you should hold a Bachelor's or Master's degree in Statistics or Applied Mathematics (or have equivalent relevant experience), along with a minimum of 4 years of experience in the field. Proficiency in Advanced SQL and Excel is a prerequisite for this role. Join us to leverage your analytical skills and contribute to the team's success by transforming data into valuable insights and reports.
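Listings on this page ask for "Advanced SQL" without spelling out what it covers; in practice it usually means constructs like CTEs, window functions, and ranking rather than plain SELECTs. A minimal, hypothetical sketch using Python's built-in sqlite3 (table and column names are invented for illustration):

```python
import sqlite3

# Hypothetical call-volume data, illustrating the CTEs and window
# functions usually meant by "Advanced SQL" in these listings.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE calls (agent TEXT, day TEXT, handled INTEGER);
INSERT INTO calls VALUES
  ('asha', '2024-01-01', 30), ('asha', '2024-01-02', 45),
  ('ravi', '2024-01-01', 50), ('ravi', '2024-01-02', 20);
""")

# Aggregate per agent in a CTE, then rank agents by total volume
# with a window function.
rows = conn.execute("""
WITH totals AS (
  SELECT agent, SUM(handled) AS total FROM calls GROUP BY agent
)
SELECT agent, total, RANK() OVER (ORDER BY total DESC) AS rnk
FROM totals ORDER BY rnk
""").fetchall()
print(rows)  # [('asha', 75, 1), ('ravi', 70, 2)]
```

The same query shape (CTE feeding a window function) carries over unchanged to the warehouses named elsewhere on this page, such as Snowflake and Hive.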

Posted 1 day ago

Apply

10.0 - 15.0 years

27 - 32 Lacs

Noida, Gurugram, India

Work from Office

Position Summary: The core objectives in this role are to drive revenue growth at a key account, in partnership with onshore Client Partners and offshore Delivery Leaders, by focusing on delivery excellence (leading to increased customer satisfaction), strategic account management, and business development. Work Experience: 10+ years of relevant delivery experience in a large/midsize IT services/consulting/analytics company; 10 years of delivery management, covering business information management (major) and commercial ops (minor); 5+ years of account management/business development/solutioning experience in the IT services/consulting industry; extensive pharma/life-science industry experience with commercial and clinical datasets. Tech Skills: Azure ADF, Informatica Intelligent Cloud Services (IICS), PowerShell scripting, Tidal for scheduling, Azure cloud services, Advanced SQL, Databricks (second priority). Job Responsibilities: Program management and delivery governance in coordination with delivery teams; representing the voice of the client in front of delivery teams; serving as the glue between onshore-offshore and cross-LOB stakeholders; anticipating and mitigating risks; uncovering new areas to add value to the client's business. Business development activities to grow existing accounts: account strategy and tactics, proposal writing (storyline, slides, documents), bid management, and driving to uncover cross-sell and up-sell opportunities. Process activities to ensure smooth financial and project operations: tracking and following up on CRM opportunities and active projects; preparing reports/decks on pipeline, revenue, profitability, and invoicing. Being successful in this role requires an attitude of ownership of all aspects of a business. It entails working collaboratively in a matrix organization with the full range of stakeholders: onshore client-facing partners; finance, legal, and marketing; and delivery teams both onshore and offshore. Education: MBA or PG Diploma in Management. Behavioural Competencies: project management; client engagement and relationship building; attention to P&L impact; capability building/thought leadership; customer focus. Technical Competencies: account management; Azure Data Factory; data governance; pharma data analytics; delivery management (BIM/cloud information management); Informatica; Azure SQL.

Posted 2 days ago

Apply

3.0 - 7.0 years

7 - 12 Lacs

Gurugram

Work from Office

Minimum 3-5 years of AWS ETL development experience. Must have experience with AWS cloud: EC2, IAM, KMS keys, AWS Lambda, Batch, Terraform/CFT, EventBridge, Managed Kafka, Kinesis, Glue, and PySpark. Understanding of data modelling concepts. Required candidate profile: skills in Java (1.8 and above) and AWS (MuleSoft good to have); knowledge of Python and other programming languages. Call Vikas: 8527840989. Email: vikasimaginators@gmail.com

Posted 3 days ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Jaipur

Work from Office

About The Role: Job Title: Associate, Regulatory Reporting Team. Location: Jaipur, India. Role Description: The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as the Markets in Financial Instruments Directive (MiFID II). This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). This role will also involve supporting in-house tech requirements such as building out reports, macros, etc. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions. Ensuring accurate, timely, and complete reporting. Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible. Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes. Performing root cause analysis of exceptions, with investigation and appropriate escalation to senior management of any significant issues found through testing, rejection remediation, or any other stream, to ensure transparency exists in our controls. Ability to build and maintain effective operational processes and prioritise activities based on risk. Clear communication and escalation; ability to recognise high-risk situations and deal with them promptly. Documentation of BI deliverables.
Supporting the design of data models, reports, and visualizations to meet business needs; developing end-user reports and visualizations. Your skills and experience: 5-8 years of work experience in an Ops role within financial services. Graduate in Science/Technology/Engineering/Mathematics. Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred. Experience in Middle Office/Back Office and Reference Data is preferable, with excellent knowledge of the trade life cycle in at least two asset classes (Equities, Credit, Rates, Foreign Exchange, Commodities). Ability to work independently as well as in a team environment. Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them promptly. Ability to identify and prioritize multiple tasks with potential operational-risk and P&L impact in an often high-pressure environment. Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA. Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc. Advanced SQL experience is preferred. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 days ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Associate, Regulatory Reporting Team. Location: Bangalore, India. Role Description: The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as the Markets in Financial Instruments Directive (MiFID II). This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). This role will also involve supporting in-house tech requirements such as building out reports, macros, etc. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions. Ensuring accurate, timely, and complete reporting. Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible. Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes. Performing root cause analysis of exceptions, with investigation and appropriate escalation to senior management of any significant issues found through testing, rejection remediation, or any other stream, to ensure transparency exists in our controls. Ability to build and maintain effective operational processes and prioritise activities based on risk. Clear communication and escalation; ability to recognise high-risk situations and deal with them promptly. Documentation of BI deliverables.
Supporting the design of data models, reports, and visualizations to meet business needs; developing end-user reports and visualizations. Your skills and experience: 5-8 years of work experience in an Ops role within financial services. Graduate in Science/Technology/Engineering/Mathematics. Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred. Experience in Middle Office/Back Office and Reference Data is preferable, with excellent knowledge of the trade life cycle in at least two asset classes (Equities, Credit, Rates, Foreign Exchange, Commodities). Ability to work independently as well as in a team environment. Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them promptly. Ability to identify and prioritize multiple tasks with potential operational-risk and P&L impact in an often high-pressure environment. Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA. Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc. Advanced SQL experience is preferred. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 days ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Regulatory Reporting Team, NCT. Location: Bangalore, India. Role Description: The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as MiFID II, EMIR, CFTC, and SFTR. This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, Loans, and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). This role will also involve supporting in-house tech requirements such as building out reports, macros, etc. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions. Ensuring accurate, timely, and complete reporting. Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible. Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes. Performing root cause analysis of exceptions, with investigation and appropriate escalation to senior management of any significant issues found through testing, rejection remediation, or any other stream, to ensure transparency exists in our controls. Ability to build and maintain effective operational processes and prioritise activities based on risk. Clear communication and escalation; ability to recognise high-risk situations and deal with them promptly. Documentation of BI deliverables.
Supporting the design of data models, reports, and visualizations to meet business needs; developing end-user reports and visualizations. Your skills and experience: 3-5 years of work experience in an Ops role within financial services. Graduate in Science/Technology/Engineering/Mathematics. Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred. Experience in Middle Office/Back Office and Reference Data is preferable, with excellent knowledge of the trade life cycle in at least two asset classes (Equities, Credit, Rates, Foreign Exchange, Commodities). Ability to work independently as well as in a team environment. Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them promptly. Ability to identify and prioritize multiple tasks with potential operational-risk and P&L impact in an often high-pressure environment. Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA. Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc. Advanced SQL experience is preferred. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 days ago

Apply

1.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Regulatory Reporting, NCT. Location: Bangalore, India. Role Description: The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as the Markets in Financial Instruments Directive (MiFID II). This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). This role will also involve supporting in-house tech requirements such as building out reports, macros, etc. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions. Ensuring accurate, timely, and complete reporting. Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible. Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes. Performing root cause analysis of exceptions, with investigation and appropriate escalation to senior management of any significant issues found through testing, rejection remediation, or any other stream, to ensure transparency exists in our controls. Ability to build and maintain effective operational processes and prioritise activities based on risk. Clear communication and escalation; ability to recognise high-risk situations and deal with them promptly. Documentation of BI deliverables.
Supporting the design of data models, reports, and visualizations to meet business needs; developing end-user reports and visualizations. Your skills and experience: 2-5 years of work experience in an Ops role within financial services. Graduate in Science/Technology/Engineering/Mathematics. Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred. Experience in Middle Office/Back Office and Reference Data is preferable, with excellent knowledge of the trade life cycle in at least two asset classes (Equities, Credit, Rates, Foreign Exchange, Commodities). Ability to work independently as well as in a team environment. Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them promptly. Ability to identify and prioritize multiple tasks with potential operational-risk and P&L impact in an often high-pressure environment. Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA. Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc. Advanced SQL experience is preferred. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 days ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Chennai

Work from Office

Fipsar is Urgently Hiring Talented Freshers! Job Title: Trainee Data Analyst / Developer (Fresher) Location: Chennai (Velachery) – Onsite Interview & Work Location Salary: 3 LPA Industry: IT / Data Analytics Employment Type: Full-Time Office Timings: Males: 12:00 PM to 9:30 PM Females: 11:00 AM to 8:00 PM Eligibility Criteria: Graduation Year: Only 2024 or 2025 pass-outs Education: Any major/discipline Academic Performance: Minimum 60% in 10th, 12th, and Graduation Experience: Freshers only Location: Open to candidates from anywhere in Tamil Nadu Required Skills: Good programming skills in SQL Strong programming knowledge in Python Excellent problem-solving and analytical skills Good communication and team collaboration skills Eagerness to learn and grow in the Data Science & Data Analytics field Passion to build a long-term career in Data Science & Data Analytics teams Other Requirements: Must attend an in-person interview at our Chennai (Velachery) office Should be ready to work onsite from our Chennai location if selected How to Apply: Interested candidates can apply here: http://nauk.in/eVFGGDs Shortlisted candidates will be contacted for the next steps in the selection process. Additional Perks: Food will be provided Medical insurance coverage up to 3 Lakhs

Posted 4 days ago

Apply

6.0 - 10.0 years

5 - 8 Lacs

Greater Noida

Work from Office

Job Description: • Experience implementing Snowflake utilities, SnowSQL, Snowpipe, and big-data modelling techniques using Python • Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns • Proficiency in RDBMS, complex SQL, PL/SQL, performance tuning, and troubleshooting • Provide resolution to an extensive range of complicated data-pipeline problems • Experience in data migration from RDBMS to the Snowflake cloud data warehouse • Experience with data security, data access controls, and design • Build processes supporting data transformation, data structures, metadata, dependency, and workload management • Experience in Snowflake modelling: roles, schemas, databases • Extensive hands-on expertise with creation of stored procedures and Advanced SQL • Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into DBT models • Develop and enforce best practices for version control, testing, and documentation of DBT models • Build and manage data quality checks and validation processes within the DBT pipelines • Ability to optimize SQL queries for performance and efficiency • Good to have: experience in Azure services such as ADF, Databricks, and data-pipeline building • Excellent analytical and problem-solving skills • Working experience in an Agile methodology • Knowledge of DevOps processes (including CI/CD) and Power BI • Excellent communication skills.
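The dbt data-quality checks this listing mentions typically compile down to SQL queries that must return zero rows to pass (dbt's built-in not_null and unique tests follow this convention). A hedged sketch of that pattern using Python's built-in sqlite3, with an invented orders table standing in for a warehouse model:

```python
import sqlite3

# Hypothetical orders model; order_id 2 is duplicated and one
# customer_id is NULL, so both checks below should flag rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id INTEGER);
INSERT INTO orders VALUES (1, 10), (2, 11), (2, 12), (3, NULL);
""")

def failing_rows(sql: str) -> int:
    """A dbt-style test passes when its query returns zero rows."""
    return len(conn.execute(sql).fetchall())

# Equivalent of dbt's not_null test on customer_id.
null_fails = failing_rows(
    "SELECT order_id FROM orders WHERE customer_id IS NULL")
# Equivalent of dbt's unique test on order_id.
dupe_fails = failing_rows(
    "SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1")
print(null_fails, dupe_fails)  # 1 1
```

In an actual dbt project these checks would be declared in a schema.yml rather than hand-written; the sketch only shows the SQL shape they compile to.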

Posted 1 week ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Hyderabad, Gurugram, Bengaluru

Hybrid

Salary: 30 to 45 LPA. Experience: 8 to 11 years. Location: Bangalore/Gurgaon. Notice: immediate joiners only. Key skills: SQL, Advanced SQL, BI tools, ETL, etc. Roles and Responsibilities: Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools. Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior. Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions. Desired Candidate Profile: 6-10 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc. Strong proficiency in tools like Tableau for data visualization; Advanced SQL knowledge preferred. Experience working with big-data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.

Posted 1 week ago

Apply

1.0 - 5.0 years

2 - 5 Lacs

Hyderabad

Work from Office

We require technical skills and business knowledge. The analyst collaborates with business users to identify necessary data, runs BI queries, and creates visualizations, reports, and dashboards to help extract insights from the analyzed data.

Posted 1 week ago

Apply

3.0 - 5.0 years

2 - 2 Lacs

Ranchi

Work from Office

Job Description: MIS Executive Department: Administration Location: Head Office Reporting to: HR & Admin Manager Job Purpose: The MIS Executive is responsible for collecting, managing, and analyzing project data related to road construction activities to support strategic and operational decision-making. This includes managing reports on project progress, resource utilization, inventory, financials, and compliance. Key Responsibilities: Maintain and update daily, weekly, and monthly MIS reports on project activities (e.g., material movement, labor deployment, equipment/machine usage). Collect data from various departments like site, purchase, maintenance, accounts, and HR for consolidation and analysis. Ensure timely and accurate data entry in Excel sheets or ERP systems. Generate DPRs (Daily Progress Reports), WPRs, and monthly status reports for management. Track cost vs. budget, timelines vs. actual progress, and report deviations. Maintain stock movement reports, inward/outward material registers, and consumption sheets. Reconcile physical stock with system reports on a periodic basis. Prepare vendor performance reports and purchase summaries. Support in rate comparisons and vendor evaluations using historical data. Maintain attendance records of staff and laborers across different project sites. Coordinate with site teams and HR for timely wage/salary processing reports. Assist in project cost tracking, expense mapping, and providing inputs for budget forecasting. Ensure proper filing (physical and digital) of all reports, invoices, and supporting documents. Maintain backups and ensure data confidentiality. Liaise with site engineers, accounts, purchase, and senior management for data collection and issue resolution. Coordinate with software/IT teams for ERP or reporting tool issues. Support internal and external audits by providing accurate and timely data. Ensure reports meet company standards and government regulations. 
Ensure timely submission of bills with all supporting documents for processing. Coordinate with departments to obtain necessary approvals. Key Skills Required: Advanced knowledge of MS Excel (pivot tables, VLOOKUP, dashboards, etc.). Experience with ERP systems (e.g., Tally, SAP, or a customized project ERP). Analytical and problem-solving ability. Attention to detail and accuracy. Good communication and coordination skills. Ability to work under deadlines. Qualifications & Experience: Graduate in Commerce/Science/Engineering (B.Com/B.Sc/B.Tech) • Diploma in Data Analytics or MIS will be a plus • Minimum 3-5 years of experience in a similar role, preferably in the construction or infrastructure sector. Other Details: Salary range: 18,000 to 22,000 per month. Working days/hours: 6 days/week, 10 AM to 6:30 PM. Employment type: Full-time.
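For readers less familiar with the Excel skills listed above: VLOOKUP is essentially a keyed lookup with a fallback for missing keys, which maps directly onto a dict lookup in most languages. A small Python sketch with invented wage data (all names and figures are illustrative only):

```python
# Hypothetical lookup table, like the second argument of VLOOKUP:
# trade -> daily wage.
rates = {"mason": 800, "helper": 500}

# Hypothetical attendance records: (worker, trade).
attendance = [("ram", "mason"), ("shyam", "helper"), ("mohan", "driver")]

# Like wrapping VLOOKUP in IFERROR: fall back to a default wage of 0
# when the trade is missing from the rate table.
payroll = [(name, rates.get(trade, 0)) for name, trade in attendance]
print(payroll)  # [('ram', 800), ('shyam', 500), ('mohan', 0)]
```

The dict-based join scales to the site-wise wage reports the role describes, whereas per-row VLOOKUPs in Excel become slow on large sheets.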

Posted 1 week ago

Apply

3.0 - 7.0 years

7 - 15 Lacs

New Delhi, Gurugram, Delhi / NCR

Hybrid

3-7 years of experience in business analytics or consulting. Expert knowledge of Advanced SQL/SQL queries. Demonstrated experience in unstructured problem solving and strong analytical aptitude. Advanced use of MS Office (Excel, PowerPoint). Required candidate profile: Call Vikas 8527840989. Email vikasimaginators@gmail.com

Posted 1 week ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Pune, Chennai, Bengaluru

Work from Office

Greetings from V2Soft! We are looking for an Offshore Developer with the skills below. Role: Offshore Developer - Data Engineer. Location: Remote. Experience: 8-12 years. Skills: Azure, Databricks, Data Factory, Python, PySpark, Advanced SQL, SQL queries. Role & responsibilities: Looking for a senior developer with hands-on experience in Databricks/PySpark, Java, and web services. Sr. Developer: Must have at least 5 years of relevant IT development experience. Must have strong analytical and problem-solving skills. Must have experience designing solutions, performing code reviews, and mentoring junior engineers. Must have strong SQL and backend experience, working on data-driven projects. Must have experience with: Python/PySpark, SQL/PL-SQL, Databricks. Preferred experience: Java/C#, Azure, Kafka, Node.js, Azure Data Factory. Interested candidates can send an updated resume to: kkumarpn@v2soft.com Regards, Kiran

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Lead Software Engineer, Backend. We're seeking a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, environmental, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations. Essential Responsibilities: As a Lead Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects; participating in the on-call rotation in your respective time zone (being available by phone or email in case something goes wrong). Desired Characteristics: Minimum 5 years of software engineering experience. Proven long-term experience with, and enthusiasm for, distributed data processing at scale, and eagerness to learn new things.
Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle, from inception to production and monitoring. Fluency in Python, or solid experience in Scala or Java. Proficiency with relational databases and Advanced SQL. Expertise in the use of services like Spark and Hive. Experience with web frameworks such as Flask and Django. Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Experience in agile software development processes. Excellent interpersonal and communication skills. Nice to have: experience with large-scale/multi-tenant distributed systems; experience with columnar/NoSQL databases (Vertica, Snowflake, HBase, Scylla, Couchbase); experience with real-time streaming frameworks (Flink, Storm); experience with open table formats such as Iceberg, Hudi, or Delta Lake.

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 8 years of software engineering experience
- An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired
- 2+ years of experience/fluency in Python
- Proficiency with relational databases and Advanced SQL
- Expertise in the usage of services like Spark and Hive
- Experience working with container-based solutions is a plus
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Experience in the Advertising Attribution domain is a plus
- Experience in agile software development processes
- Excellent interpersonal and communication skills
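The "Advanced SQL" skill these listings repeatedly ask for typically covers window functions, CTEs, and analytic aggregates. A minimal, hypothetical illustration (the table and column names are invented for this sketch) using Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical events table; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INT, ts INT, amount REAL);
    INSERT INTO events VALUES
        (1, 100, 10.0), (1, 200, 20.0),
        (2, 150, 5.0),  (2, 300, 15.0);
""")

# CTE + window function: pick each user's latest event by timestamp,
# a common "deduplicate to most recent row" pattern.
rows = conn.execute("""
    WITH ranked AS (
        SELECT user_id, ts, amount,
               ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts DESC) AS rn
        FROM events
    )
    SELECT user_id, ts, amount FROM ranked WHERE rn = 1 ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 200, 20.0), (2, 300, 15.0)]
```

The same ROW_NUMBER-over-partition pattern carries over to Snowflake, Hive, and Spark SQL, which is why interviewers for these roles tend to probe it.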

Posted 1 week ago

Apply

8.0 - 13.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 8 years of software engineering experience
- An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired
- 2+ years of experience/fluency in Python
- Proficiency with relational databases and Advanced SQL
- Expertise in the usage of services like Spark and Hive
- Experience working with container-based solutions is a plus
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Experience in the Advertising Attribution domain is a plus
- Experience in agile software development processes
- Excellent interpersonal and communication skills

Posted 1 week ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 8 years of software engineering experience
- An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired
- 2+ years of experience/fluency in Python
- Proficiency with relational databases and Advanced SQL
- Expertise in the usage of services like Spark and Hive
- Experience working with container-based solutions is a plus
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Experience in the Advertising Attribution domain is a plus
- Experience in agile software development processes
- Excellent interpersonal and communication skills

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Data

We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
- Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing
- Ingesting vast amounts of identity and event data from our customers and partners
- Facilitating data transfers across systems
- Ensuring the integrity and health of our datasets
- And much more

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 5-10 years of software engineering experience
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and Advanced SQL
- Expertise in the usage of services like Spark and Hive
- Experience with web frameworks such as Flask or Django
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience with Kafka or other stream message processing solutions
- Experience using cloud services (AWS) at scale
- Experience in agile software development processes
- Excellent interpersonal and communication skills

Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase
- Experience with real-time streaming frameworks: Flink, Storm
- Experience with open table formats such as Iceberg, Hudi, or Delta Lake

Posted 1 week ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Bengaluru

Work from Office

We are looking for lead or principal software engineers to join our Data Cloud team. Our Data Cloud team is responsible for the Zeta Identity Graph platform, which captures billions of behavioural, demographic, environmental, and transactional signals for people-based marketing. As part of this team, the data engineer will be designing and growing our existing data infrastructure to democratize data access, enable complex data analyses, and automate optimization workflows for business and marketing operations.

Job Description:

Essential Responsibilities: As a Lead or Principal Data Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as HDFS, Spark, Snowflake, Hive, HBase, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in a 24/7 on-call rotation (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 7 years of software engineering experience
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and Advanced SQL
- Expertise in the usage of services like Spark, HDFS, Hive, HBase
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Experience in agile software development processes
- Excellent interpersonal and communication skills

Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase
- Experience with real-time streaming frameworks: Flink, Storm
- Experience with web frameworks such as Flask or Django

Posted 1 week ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client's end, and to ensure they meet 100% of quality assurance parameters.

Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise
- 4+ years of data warehouse experience: hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema; normalization/denormalization; dimensions; aggregations; etc.
- 4+ years of experience working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, etc.
- 3+ years on Python: advanced Python expertise
- 3+ years on any cloud platform (AWS preferred): hands-on experience with AWS Lambda, S3, SNS/SQS, and EC2 is the bare minimum
- 3+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, DBT, etc.
- 3+ years developing functional metrics in a specific business vertical (finance, retail, telecom, etc.)

Must-have soft skills:
- Clear communication: written and verbal, especially around time off, delays in delivery, etc.
- Team player: works in the team and works with the team
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.

Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP, or any other recognized software vendor
- 4+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, DBT, etc.
- 4+ years developing functional metrics in a specific business vertical (finance, retail, telecom, etc.)
- 4+ years of team lead experience
- 3+ years in a large-scale support organization supporting thousands of users
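The star schema and normalization/denormalization terms in the listing above describe the usual warehouse layout: a wide fact table of events joined to small dimension tables of attributes. A tiny, hypothetical fact/dimension pair (all names invented) sketched with Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per product attribute set (hypothetical schema).
    CREATE TABLE dim_product (product_id INT PRIMARY KEY, category TEXT);
    -- Fact table: one row per sale, keyed to the dimension.
    CREATE TABLE fact_sales (product_id INT, qty INT);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# Typical star-schema rollup: join fact to dimension, aggregate by attribute.
totals = dict(conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category
""").fetchall())
print(totals)  # {'books': 5, 'games': 7}
```

Denormalizing here would mean copying `category` onto each fact row to skip the join, trading storage and update cost for cheaper reads; that trade-off is what these interview questions usually circle around.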

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru, Karnataka

Work from Office

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you've got the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Driving business decisions with technical input and leading the team
- Designing, implementing, and supporting a data infrastructure from scratch
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Building data platforms, data pipelines, or data management and governance tools

BASIC QUALIFICATIONS for Data Engineer:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and to coach agile ways of working
- Ability to structure and organize teams, and to streamline communication
- Prior work experience executing large-scale data engineering projects
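The extract-transform-load work described in the listings above follows a common shape regardless of the stack: pull raw rows from a source, validate and normalize them (quarantining anything that fails a quality check), then load the clean subset into a sink. A hypothetical pure-Python sketch of that shape (no AWS services involved; every name here is invented for illustration):

```python
def extract():
    # Stand-in for reading raw records from S3, Glue, or a source database.
    return [
        {"id": 1, "amount": "10.5"},
        {"id": 2, "amount": "bad"},   # fails the quality check below
        {"id": 3, "amount": "7.25"},
    ]

def transform(rows):
    # Validate and normalize; drop rows that fail a data-quality check.
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # in production these would be logged or quarantined
    return clean

def load(rows, sink):
    # Stand-in for writing to a warehouse table (e.g. Redshift).
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2
```

Schedulers like Airflow orchestrate exactly these three steps as separate tasks, which is why ETL experience and scheduler experience are listed side by side in these postings.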

Posted 2 weeks ago

Apply

8.0 - 13.0 years

27 - 32 Lacs

Pune

Work from Office

You will:
- Deliver relevant insights that inform and influence strategic corporate programs through the derivation and curation of key performance metrics
- Develop an understanding of both business processes and the technologies that enable them, delivering insights that are grounded in operational reality and technical feasibility
- Help translate complex data into meaningful narratives that inspire executive decision-making and shape future performance
- Find success in this role by demonstrating the ability to connect the dots between data, strategy, and outcomes

You will be part of the Global Analytics and Insights organization, reporting to the Pre/Post Sales Insights team. You will partner with Marketing, Sales Development, Customer Success, Global Support, Go-Live, Compliance, and many other teams to provide detailed analyses and insights.

What Your Responsibilities Will Be:
- Lead the value stream for critical business processes, ensuring agreement between insights, operational goals, and the development of scalable, long-term analytical solutions
- Facilitate and manage cross-functional partnerships, agreement on, and execution of process improvements and strategic programs
- Deliver data-driven insights and performance metrics that inform key decisions and support the successful implementation and measurement of business strategies
- Engage with senior executives and partners, including major customers, to present findings, influence direction, and ensure solutions address our needs

You will be reporting to the Senior Director, Value Creation Insights.

What You'll Need to Be Successful: a minimum of 8 years of related experience with a Bachelor's degree, or 5 years with a Master's degree, or an equivalent combination of education and experience; demonstrated advanced SQL skills and data modeling expertise; a proven ability to bring in best practices and streamline workflows; the ability to create solutions for process improvement and efficiency gains; and the subject matter expertise (SME) and intellectual curiosity to achieve results.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Hello Jobseekers, I am now hiring for an Adobe Analytics Senior Engineer role for my client.

Location: Bangalore
Experience: 6-12 Years
NP: Immediate-30 days

Must have:
- 6+ years of overall experience working in web analytics or a related field
- Bachelor's/Master's degree in Computer Science, or equivalent work experience
- Solid understanding of online marketing, tools, and technology
- Strong understanding of HTML and web protocols
- Strong-to-advanced JavaScript skills
- Passion for the internet domain and the use of technology to solve business problems
- Solid understanding of general business models, concepts, and strategies
- Must be self-motivated, responsive, professional, and dedicated to customer success
- Possess an innovative, problem-solving, and solutions-oriented mindset
- Exceptional organizational, presentation, and communication skills, both verbal and written
- Demonstrated ability to learn quickly, be a team player, and manage change effectively
- Extensive knowledge of Microsoft Office

Special consideration given for:
- Previous experience working with Adobe Analytics or similar tools
- Website optimization consulting experience
- Advanced SQL skills, data modelling / data warehousing skills
- Web development experience
- Experience working with mobile/media analytics implementations down to core code-level details and development
- Experience with APIs; hands-on development experience in programming languages such as Node.js or Python
- ERP, SaaS, or other software implementation experience
- Deep vertical industry experience (e.g., retail, media, financial services, high tech, etc.)
- Expertise with mobile or social media analytics

Kindly share your resume at chanchal@oitindia.com, or share this job in your network.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Exp: 5 to 10 years
Location: Bangalore/Gurgaon
Notice: Immediate only
Key Skills: SQL, Advanced SQL, BI tools, ETL, etc.

Roles and Responsibilities:
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions

Desired Candidate Profile:
- 6-10 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; Advanced SQL knowledge preferred
- Experience working with big data technologies in the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required

Posted 2 weeks ago

Apply