4.0 - 6.0 years
6 - 11 Lacs
Gurugram
Work from Office
Job Purpose
As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will contribute to a variety of development initiatives focused on cloud migration, automation, and application development, delivering scalable, efficient, and secure solutions and implementing DevOps best practices in multi-cloud environments, with a strong emphasis on Google Cloud Platform (GCP).

Desired Skills and Experience
Essential skills:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of experience in software development using C#, MSSQL, Python, and GCP/BigQuery
- Strong problem-solving skills and attention to detail
- Excellent communication and teamwork abilities
- Experience in code reviews and maintaining code quality
- Ability to mentor and guide junior developers

Key Responsibilities
- Design and Development: contribute to the design and development of innovative software solutions that meet business requirements
- Application Development: develop and maintain applications using specified technologies such as C#, MSSQL, Python, and GCP/BigQuery
- Code Reviews: participate in code reviews to ensure high-quality code and adherence to best practices
- Troubleshooting: troubleshoot and resolve technical issues promptly to ensure smooth operation of applications
- Collaboration: collaborate with cross-functional teams to integrate software solutions and achieve project goals
- Mentorship: provide technical guidance and mentorship to junior team members, fostering their growth and development

Key Metrics
- C#, MSSQL, Python
- GCP/BigQuery
- Exposure to DevOps

Behavioral Competencies
- Good communication (verbal and written)
- Experience in managing client stakeholders
Posted 13 hours ago
4.0 - 6.0 years
4 - 8 Lacs
Gurugram
Work from Office
As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will partner with the data strategy and sourcing team on data requirements to migrate scripts from MATLAB to Python, and work on re-creating data visualizations using Tableau/Power BI.

Desired Skills and Experience
Essential skills:
- 4-6 years of experience with data analytics
- Skilled in Python, PySpark, and MATLAB
- Working knowledge of Snowflake and SQL
- Hands-on experience generating dashboards using Tableau/Power BI
- Experience working with financial and/or alternative data products
- Excellent analytical and strong problem-solving skills
- Working knowledge of data science concepts, regression, statistics, and the associated Python libraries
- Interest in quantitative equity investing and data analysis
- Familiarity with version control systems such as Git
- Education: B.E./B.Tech in Computer Science or a related field

Key Responsibilities
- Re-write and enhance the existing analytics process and code from MATLAB to Python
- Build a GUI to allow users to provide parameters for generating these reports
- Store the data in Snowflake tables and write queries using PySpark to extract, manipulate, and upload data as needed
- Re-create the existing dashboards in Tableau and Power BI
- Collaborate with the firm's research and IT teams to ensure data quality and security
- Engage with technical and non-technical clients as SME on data asset offerings

Key Metrics
- Python, SQL, MATLAB, Snowflake, Pandas/PySpark
- Tableau, Power BI, data science

Behavioral Competencies
- Good communication (verbal and written)
- Experience in managing client stakeholders
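Much of the MATLAB-to-Python migration work described in this posting amounts to translating built-in MATLAB idioms into plain Python or pandas. A minimal, hypothetical sketch (the function name and window convention are illustrative, not from the posting):

```python
def movmean(values, window):
    """Pure-Python analogue of a trailing MATLAB-style moving average:
    at each index, average the last `window` points, shrinking the
    window near the start of the series."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1) : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

In practice the pandas equivalent, `Series.rolling(window, min_periods=1).mean()`, is usually preferred for large datasets.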
Posted 13 hours ago
10.0 - 15.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Work as part of an Agile development team, taking ownership of one or more services:
- Provide leadership to the Agile development team, driving technical designs to support business goals
- Ensure the entire team exemplifies excellence in design, code, test, and operation
- Lead by example, embracing change and fostering a growth-and-learning culture on the team
- Mentor team members through code reviews and design reviews
- Take a lead role, working with product owners to help refine the backlog, breaking down features and epics into executable stories
- Have a high-quality software mindset, making sure that the code you write works

QUALIFICATIONS
Education: Bachelor's/Master's degree in Computer Science or equivalent.

Mandatory Skills:
- 10+ years of hands-on software engineering experience
- Recent experience in Java 17+
- Experience in developing REST services
- Experience with unit test frameworks
- Ability to provide solutions based on business requirements
- Ability to collaborate with cross-functional teams
- Ability to work with global teams and a flexible work schedule
- Excellent problem-solving skills and a customer-centric mindset
- Excellent communication skills

Preferred Skills:
- Experience with microservices, CI/CD, event-oriented architectures, and distributed systems
- Experience with cloud environments (e.g., Google Cloud Platform, Azure, Amazon Web Services)
- Experience leading product-oriented engineering development teams is a plus
- Familiarity with web technologies (e.g., JavaScript, HTML, CSS), data manipulation (e.g., SQL), and version control systems (e.g., GitHub)
- Familiarity with DevOps practices/principles, Agile/Scrum methodologies, CI/CD pipelines, and the product development lifecycle
- Familiarity with modern web APIs and full-stack frameworks
- Experience with Java, ElasticSearch, Kubernetes, Spring, and Spring Boot
- Experience developing eCommerce systems, especially B2B eCommerce, is a plus
Posted 13 hours ago
2.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience
Essential skills:
- B.Tech/M.Tech/MCA with 2-4 years of overall experience
- Skilled in Python and SQL
- Experience with data modeling, data warehousing, and building data pipelines
- Experience working with FTP, API, S3, and other distribution channels to source data
- Experience working with financial and/or alternative data products
- Experience working with cloud-native tools for data processing and distribution
- Experience with Snowflake and Airflow

Key Responsibilities
- Engage with vendors and technical teams to systematically ingest, evaluate, and create valuable data assets
- Collaborate with the core engineering team to create central capabilities to process, manage, and distribute data assets at scale
- Apply robust data quality rules to systematically qualify data deliveries and guarantee the integrity of financial datasets
- Engage with technical and non-technical clients as SME on data asset offerings

Key Metrics
- Python, SQL, Snowflake
- Data engineering and pipelines

Behavioral Competencies
- Good communication (verbal and written)
- Experience in managing client stakeholders
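The "data quality rules" mentioned above typically boil down to per-row checks applied before a delivery is accepted. A toy sketch of that idea (function name, field names, and the valid/rejected split are all hypothetical, not from the posting):

```python
def validate_rows(rows, required_fields, numeric_fields=()):
    """Apply simple data-quality rules to a list of dict records:
    required fields must be present and non-empty, numeric fields must
    parse as floats. Returns (valid_rows, rejected_rows)."""
    valid, rejected = [], []
    for row in rows:
        ok = all(row.get(f) not in (None, "") for f in required_fields)
        for f in numeric_fields:
            try:
                float(row[f])
            except (KeyError, TypeError, ValueError):
                ok = False
        (valid if ok else rejected).append(row)
    return valid, rejected
```

Production pipelines would express the same rules declaratively (e.g., as warehouse constraints or framework-level expectations) rather than hand-rolled loops.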
Posted 13 hours ago
6.0 - 8.0 years
12 - 17 Lacs
Gurugram
Work from Office
As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience
Essential skills:
- A bachelor's degree in computer science, engineering, mathematics, or statistics
- 6-8 years of experience in a Data Engineering role, with a proven track record of delivering insightful, value-add dashboards
- Experience writing advanced SQL queries and Python, and a deep understanding of relational databases
- Experience working within an Azure environment
- Experience with Tableau; Holland Mountain ATLAS is a plus
- Experience with master data management and data governance is a plus
- Ability to prioritize multiple projects simultaneously, problem-solve, and think outside the box

Key Responsibilities
- Develop, test, and release data packages for Tableau dashboards to support all business functions, including investments, investor relations, marketing, and operations
- Support ad hoc requests, including the ability to write queries and extract data from a data warehouse
- Assist with the management and maintenance of an Azure environment
- Maintain a data dictionary, including documentation of database structures, ETL processes, and reporting dependencies

Key Metrics
- Python, SQL
- Data engineering, Azure, and ATLAS

Behavioral Competencies
- Good communication (verbal and written)
- Experience in managing client stakeholders
Posted 13 hours ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking an experienced ETL Data Engineer with expertise in Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter to support our ongoing and upcoming projects. The ideal candidate will be responsible for designing, developing, and maintaining data integration processes using both IICS and PowerCenter. Proficiency in Oracle is essential, including hands-on experience in building, optimizing, and managing data solutions on the platform. The candidate should be able to handle tasks independently, demonstrating strong problem-solving skills and initiative in managing data integration projects. This role involves close collaboration with business stakeholders, data architects, and cross-functional teams to deliver effective data solutions that align with business objectives.

Who you are:
Basic Qualifications:
- Education: Bachelor's in Computer Science/IT or similar
- Mandatory skills: ETL data engineering, IICS, Informatica PowerCenter
- Nice to have: Unix
Posted 13 hours ago
4.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Job Purpose
Evaluate the data governance framework and Power BI environment. Provide recommendations for enhancing data quality and discoverability, and optimize Power BI performance.

Desired Skills and Experience
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 7+ years of experience as a Data and Cloud Architect working with client stakeholders
- Excellent communication skills, both written and verbal
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience with delivering projects within an agile environment
- Experience in project management and team management

Key responsibilities include:
- Understand and review PowerShell (PS), SSIS, Batch Scripts, and C# (.NET 3.0) codebases for data processes
- Assess the complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB)
- Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial
- Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones)
- Understand requirements for external tables (Lakehouse)
- Evaluate and ensure the quality of deliverables within project timelines
- Develop a strong understanding of the equity market domain
- Collaborate with domain experts and business stakeholders to understand business rules/logic
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments
- Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT)
Posted 13 hours ago
4.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-cloud-based analytics platform, supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionality. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working in agile methodologies with high autonomy. The role requires office presence at our Bangalore location.

Who You Are:
- University degree in Computer Science, Engineering, or a related field
- Proficiency in Python, especially the boto3 library for interacting with AWS services programmatically, and infrastructure as code with AWS CDK and AWS Lambda
- Experience with API development and management: designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks
- Strong understanding of AWS security best practices, IAM policies, encryption, auditing, and regulatory compliance (e.g., GDPR)
- Experience with application performance monitoring and tracing solutions such as AWS CloudWatch, X-Ray, and OpenTelemetry
- Proficiency in navigating and utilizing various AWS tools and services
- System design skills in a cloud environment
- Experience with SQL and data integration into Snowflake
- Familiarity with Microsoft Entra ID for identity and access management
- Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany), and India
- Sense of accountability and ownership; fast learner
- Fluency in English and excellent communication skills
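The Lambda-behind-API-Gateway pattern this posting references follows a fixed handler contract: the function receives an event dict and returns a status code, headers, and a JSON-serialized body. A minimal, hypothetical sketch (the endpoint and parameter names are illustrative, not from the posting):

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration:
    reads a query-string parameter and echoes it back as JSON."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

Keeping handlers this thin (parse event, delegate, serialize response) makes them easy to unit-test locally without any AWS infrastructure.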
Posted 13 hours ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
- Attend to inbound technical calls from customers on breakdown of systems on site
- Troubleshoot system issues/errors/alerts and alarms, with a sound understanding of system/IP maintenance, logical and technical skills relevant to the Lab Water Solutions product portfolio, and significant water filtration knowledge
- Inform in-house teams for urgent on-site clinical customer support where priority dwells on critical scenarios (such as hospitals and GSA)
- Handle work orders processed by partners in the field (Field Service Engineers), completing the relevant fields necessary before closing the WO as valid for billing and unbilled covered tasks carried out on site
- Push billable repair quotes for scheduled repairs on systems, as they represent potential orders
- Chase billable work orders for payment to close before the billing cycle
- Follow up on service contract renewals, as systems maintained periodically in-house under subscription are easier to repair
- Within the above scope of responsibility, manage the workflow of transactional activity
- Create effective communication on process during BCP and internal resource availability
- Complete tasks within the SLA-agreed timeline, quality metrics, and service objectives
- Create critical reviews of non-GXP activities
- Implement changes aligned to change management and strategic requirements

Who You are:
- years of experience
- Strong communication, technical knowledge of instrumentation and equipment, call handling, customer handling, troubleshooting of instrumentation

Preferred Requirements:
- Power BI, basic AI, teamwork, problem solving, stakeholder management
Posted 13 hours ago
4.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Your focus will be the area of SAP Security and Authorization. You will be a member of the Cyber Alert Response Center (CARC) team in the daily operations and continuous improvement of detecting, monitoring, and documenting security threats for SAP ABAP and Java ERP systems. Using a Security Information and Event Management (SIEM) tool, you will be responsible for analyzing SAP security authorization alerts and coordinating with the responsible teams to remediate, mitigate, or resolve security threats. SIEM-monitored SAP production systems must meet and pass the strict audit requirements of the KRITIS and NIS2 guidelines. We use SIEM to monitor and identify vulnerabilities and track their resolution. Prior SIEM experience is good to have but not mandatory; you will be trained in SIEM. You will need to become familiar with all SAP authorization listeners and use cases configured in the SIEM baseline and template in order to analyze and assess authorization risk. You will be responsible for monitoring SIEM alerts through the Security and Compliance Monitor and Event Monitor and documenting any findings daily. You will collaborate with global teams at least twice a month to report, perform security assessments and root cause analysis, and review processes. You will participate in organizing and facilitating SAP Security Authorization meetings to follow up on, track, and discuss the progress and status of reported incidents. You will meet with the CARC team daily to report and discuss very-high and high security authorization alerts and plans to resolve them. In addition, you will participate in meetings/discussions regarding application owners, SAP security patching and notes, SAP parameters, SAP code vulnerability, and the Event Monitor (threat detection). Following our continuous improvement concept, you will identify potential improvement areas and design and implement solutions for the handling of security alerts.

You will become familiar with and utilize the possible solutions listed to manage SIEM alerts, helping reduce the overall numbers in the high-level monthly report. You will be responsible for including comments or providing an explanation for the number of changes related to authorization:
- Incident Manager and documentation
- Risk Acceptance (whitelisting)
- Risk Acceptance (SAP Role/SAP Profile)
- Action & Filters Roadmap
- SAP role removal/SAP role modification (Security Team)

Work environment:
- Working in an international, virtual team in close collaboration, able to function as a cross-functional backup for other members of the CARC team
- Working very closely with the SAP authorization team and other cross-functional teams

Qualifications:
- Bachelor's or Master's degree and a minimum of 8 years of comparable job experience
- Experience in SAP Security design, administration, troubleshooting, and operations is mandatory
- Experience with SAP Java security and administration
- Experienced in cybersecurity frameworks for SAP ERP support
- Good knowledge of service support processes and SLAs

Skills:
- Excellent, fluent communication in English, speaking and writing
- Strong analytical and conceptual skills
- Organizational talent and intercultural competence
- Communicative team player
- Self-driven and flexible
- Excellent Microsoft Word, PowerPoint, and Excel skills

Tools:
- SAP Security and SAP ERP
- Experience with any SIEM tool is good to have
- Power BI is a plus
Posted 13 hours ago
8.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We are seeking a skilled and motivated Team Lead - Automation to oversee the automation of processes within our Finance, Procurement, and HR functions. This role is responsible for delivering use cases within its portfolio while also engaging in hands-on development. The Team Lead will manage a small team and serve as the primary point of contact for business stakeholders regarding delivery and automation initiatives.

Delivery Management:
- Act as a point of contact for a specific business function to deliver the identified automation use cases and monitor their performance in production
- Build and execute the delivery plan for use cases approved for delivery, in alignment with relevant stakeholders
- Manage the capacity planning needed to deliver the automation pipeline
- Collaborate with project managers to ensure projects are on track
- Manage delivery governance with relevant stakeholders to communicate automation program status, drive escalations, and support needs
- Collaborate with IT teams to ensure all IT prerequisites are delivered on time

Technical Management:
- Act as a technical lead to design automation solutions for different business problems
- Provide technical assistance to developers as needed
- Perform technical governance on the deliverables of the development team
- Perform hands-on technical development for critical use cases
- Drive innovation by performing proofs of concept utilizing advanced technologies like AI, ML, and LLMs

Operations Management:
- Responsible for incident management and change management for all live bots in scope
- Responsible for managing governance and reporting for operations

Stakeholder Management:
- Excellent stakeholder management skills to understand business expectations and deliver through multiple teams
- Drive multiple initiatives with stakeholders of varied skills across different roles

Who you are:
- Education: Bachelor's or Master's degree in Computer Science
- Proficiency in English (verbal, written)
- 8-10 years of proven experience in developing and delivering Robotic Process Automation use cases for functions like Finance, Procurement, and HR
- Expertise in RPA tools such as Power Automate Desktop, Automation Anywhere, and UiPath
- Technical expertise: programming languages (.NET, Java, VB, Python), databases (SQL), OCR technologies, Excel macros
- Expertise in utilizing RPA capabilities to automate SAP applications, web applications, and document data extraction
- Experience in managing delivery of automation use cases
- Stakeholder management: experienced in managing internal and external stakeholders effectively
- Secondary skills: enterprise architecture — should be well versed in the enterprise architecture landscape
- Experience in delivering solutions for GBS domains (Finance, Procurement, HR, Customer Experience, etc.)
Posted 13 hours ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Design, construct, and maintain scalable data management systems using Azure Databricks, ensuring they meet end-user expectations. Supervise the upkeep of existing data infrastructure workflows to ensure continuous service delivery. Create data processing pipelines utilizing Databricks notebooks, Spark SQL, Python, and other Databricks tools. Oversee and lead modules through planning, estimation, implementation, monitoring, and tracking.

Desired Skills and Experience
- 5+ years of experience in software development using Python, PySpark, and its frameworks
- Proven experience as a Data Engineer with experience in the Azure cloud
- Experience implementing solutions using Azure cloud services: Azure Data Factory, Azure Data Lake Gen 2, Azure Databases, Azure Data Fabric, API gateway management, Azure Functions
- Experience designing, building, testing, and maintaining highly scalable data management systems using Azure Databricks
- Strong SQL skills with RDBMS or NoSQL databases
- Experience developing APIs using FastAPI or similar frameworks in Python
- Familiarity with the DevOps lifecycle (Git, Jenkins, etc.) and CI/CD processes
- Good understanding of ETL/ELT processes
- Experience in the financial services industry, financial instruments, asset classes, and market data is a plus
- Data warehousing: in-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Excellent problem-solving skills, with the ability to work independently or as part of a team
- Strong communication and interpersonal skills, with the ability to effectively engage both technical and non-technical stakeholders
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts

Key responsibilities include:
- Interpret business requirements, either gathered or acquired
- Work with internal resources as well as application vendors
- Design, develop, and maintain Databricks solutions and relevant data quality rules
- Troubleshoot and resolve data-related issues
- Configure and create data models and data quality rules to meet customer needs
- Hands-on handling of multiple database platforms, such as Microsoft SQL Server and Oracle
- Review and analyze data from multiple internal and external sources
- Assist stakeholders with data-related technical issues and support their data infrastructure needs
- Develop and maintain documentation for data pipeline architecture, development processes, and data governance
- Analyze existing PySpark/Python code and identify areas for optimization
- Write new optimized SQL queries or Python scripts to improve performance and reduce run time
- Identify opportunities for efficiencies and innovative approaches to completing the scope of work
- Write clean, efficient, and well-documented code that adheres to best practices and Council IT coding standards
- Maintain and operate existing custom code processes
- Participate in team problem-solving efforts and offer ideas to solve client issues
- Query-writing skills, with the ability to understand and implement changes to SQL functions and stored procedures
- Effectively communicate with business and technology partners, peers, and stakeholders
- Deliver results under demanding timelines on real-world business problems
- Work independently and multi-task effectively
- Configure system settings and options and execute unit/integration testing
- Develop end-user release notes and training materials, and deliver training to a broad user base
- Identify and communicate areas for improvement
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT)
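The code-optimization work this posting describes (analyzing existing PySpark/Python code to reduce run time) often comes down to replacing repeated scans with a single lookup table. A toy pure-Python illustration of the before/after shape of such a change (record and field names are hypothetical):

```python
def join_slow(trades, prices):
    """O(n*m): re-scans the full price list for every trade."""
    out = []
    for t in trades:
        for p in prices:
            if p["symbol"] == t["symbol"]:
                out.append({**t, "price": p["price"]})
    return out

def join_fast(trades, prices):
    """O(n+m): builds a hash lookup once; same result, linear time."""
    lookup = {p["symbol"]: p["price"] for p in prices}
    return [{**t, "price": lookup[t["symbol"]]}
            for t in trades if t["symbol"] in lookup]
```

In PySpark the analogous change is letting the engine perform a join (e.g., a broadcast join for a small dimension table) instead of collecting data and looping over it on the driver.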
Posted 13 hours ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as a primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Write optimized SQL queries for data extraction, transformation, and loading
- Utilize Python for advanced data processing, automation tasks, and system integration
- Act as an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices
- Develop and maintain complex data pipelines and ETL processes in Snowflake
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions
- Ensure data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Build data tests, unit tests, and mock data frameworks

Who You Are:
- Master's degree in Computer Science, Information Technology, or a related field
- At least 3+ years of proven experience as a Snowflake developer, and a minimum of 8+ years of total experience with data modelling (OLAP and OLTP)
- Extensive hands-on experience writing complex SQL queries and advanced Python, demonstrating proficiency in data manipulation and analysis for large data volumes
- Strong understanding of data warehousing concepts, methodologies, and technologies, with in-depth experience in data modelling techniques (OLTP, OLAP, Data Vault 2.0)
- Experience building data pipelines using DBT (Data Build Tool) for data transformation
- Familiarity with advanced performance tuning methodologies in Snowflake, including query optimization
- Strong knowledge of CI/CD pipelines, preferably in Azure DevOps
- Excellent problem-solving, analytical, and critical thinking skills
- Strong communication, collaboration, and interpersonal skills
- Knowledge of additional data technologies (e.g., AWS, Azure, GCP) is a plus
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or CloudFormation is a plus
- Experience in leading projects or mentoring junior developers is advantageous
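The "data tests" this role calls for are typically small declarative checks, in the spirit of dbt's built-in unique and not_null tests, run against every load. A minimal, hypothetical Python sketch of the same idea for in-memory records (the function name and failure format are illustrative):

```python
def assert_unique_not_null(rows, key):
    """Minimal data test: the given key must be non-null and unique
    across all rows. Raises AssertionError listing every violation."""
    seen, problems = set(), []
    for row in rows:
        value = row.get(key)
        if value is None:
            problems.append("null value")
        elif value in seen:
            problems.append(f"duplicate: {value}")
        seen.add(value)
    assert not problems, f"column {key!r} failed checks: {problems}"
```

In a dbt project the equivalent is two lines of YAML on the column (`tests: [unique, not_null]`), which dbt compiles into SQL run inside the warehouse.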
Posted 13 hours ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as a primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Write optimized SQL queries for data extraction, transformation, and loading
- Utilize Python for advanced data processing, automation tasks, and system integration
- Act as an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices
- Develop and maintain complex data pipelines and ETL processes in Snowflake
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions
- Ensure data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Build data tests, unit tests, and mock data frameworks

Who You Are:
- Bachelor's or Master's degree in computer science, mathematics, or related fields
- At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist
- In-depth knowledge of Snowflake components, including security and governance
- Proven experience implementing complex data models (e.g., OLTP, OLAP, Data Vault)
- A strong understanding of ETL, including end-to-end data flows, from ingestion to data modeling and solution delivery
- Proven industry experience with DBT and Jinja scripts
- Strong proficiency in SQL, with additional knowledge of Python (i.e., pandas and PySpark) being advantageous
- Familiarity with data & analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have
- Experience working with Azure DevOps and warehouse automation tools (e.g., Coalesce) is a plus
- Experience with healthcare R&D is a plus
- Excellent English communication skills, with the ability to effectively engage both R&D scientists and software engineers
- Experience working in virtual and agile teams
Posted 13 hours ago
5.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Purpose Develop and execute test cases for both UI and API, with a focus on FSI trading workflows. Implement and utilize test management tools (e.g., X-Ray/JIRA).
Desired Skills and Experience Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field. 5+ years of experience in Quality Assurance (Testing) working with client stakeholders. Significant experience performing testing in the Financial Services Industry. Hands-on expertise in UI testing with Puppeteer. Strong experience in API testing using SOAPUI, Maven, and Jenkins in a CI/CD pipeline. Deep understanding of FSI trading platforms and tools (e.g., Polaris) and fixed income products. Proven ability to establish QA processes and frameworks in environments with minimal existing structure. Excellent problem-solving, analytical, and communication skills, both written and verbal. Experience working with agile methodology, Jira, Confluence, etc. Extremely strong organizational and analytical skills with strong attention to detail. Strong track record of excellent results delivered to internal and external clients. Able to work independently without the need for close supervision, and also collaboratively as part of cross-team efforts. Experience with delivering projects within an agile environment.
Key responsibilities include: Establish and implement comprehensive QA strategies and test plans from scratch. Address immediate pain points in UI (Puppeteer) and API (SOAPUI, Maven, Jenkins) testing, including triage and framework improvement. Apply strong SQL skills. Develop and execute test cases for both UI and API, with a focus on fixed income trading workflows. Drive the creation of regression test suites for critical back-office applications. Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC. Implement and utilize test management tools (e.g., X-Ray/JIRA). Provide clear and concise reporting on QA progress and metrics to management. Bring strong subject matter expertise in the Financial Services Industry, particularly fixed income trading products and workflows. Basic knowledge of Python and its libraries such as Pandas, NumPy, etc. Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders. Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments. Responsible for end-to-end delivery of projects, coordination between client and internal offshore teams, and managing client queries. Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and show a natural aptitude for developing good internal working relationships and a flexible work ethic. Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
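The kind of regression test case this role calls for can be illustrated with a small, self-contained sketch. The payload fields and validation rules below are hypothetical (not from Polaris or any real trading platform), and the real suite would exercise live APIs via SOAPUI rather than a local function.

```python
import unittest

def validate_trade(trade: dict) -> list:
    """Return validation errors for a hypothetical fixed-income trade payload."""
    errors = []
    if trade.get("side") not in {"BUY", "SELL"}:
        errors.append("side must be BUY or SELL")
    # Short-circuit keeps the subscript safe when quantity is missing.
    if not isinstance(trade.get("quantity"), (int, float)) or trade["quantity"] <= 0:
        errors.append("quantity must be positive")
    if not trade.get("isin"):
        errors.append("isin is required")
    return errors

class TradeValidationTest(unittest.TestCase):
    def test_valid_trade_passes(self):
        trade = {"side": "BUY", "quantity": 1_000_000, "isin": "US912828U816"}
        self.assertEqual(validate_trade(trade), [])

    def test_missing_isin_is_reported(self):
        trade = {"side": "SELL", "quantity": 500}
        self.assertIn("isin is required", validate_trade(trade))
```

Cases like these would be tracked as X-Ray/JIRA test items and run in the Jenkins CI/CD pipeline alongside the SOAPUI API suites.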
Posted 13 hours ago
9.0 - 14.0 years
14 - 19 Lacs
Hyderabad
Work from Office
As a Senior Principal Statistical Programmer, you will have the opportunity to work with advanced technical solutions such as R, Shiny, and SAS, allowing you to lead asset teams and mentor junior staff effectively. In this role, you will contribute to global assets across a variety of therapeutic areas, shaping strategic decisions in statistical programming. Your responsibilities will include leading the trial or asset programming team as the Lead Statistical Programmer, ensuring that asset and trial delivery aligns with established timelines and quality standards. You will perform programming activities at both trial and asset levels, including the development of SDTM and ADaM datasets and the creation of specifications. Additionally, you will develop and validate analytical outputs in accordance with the Statistical Analysis Plan and create datasets for integrated analyses such as ISS or ISE. You will also be responsible for executing ad-hoc programming activities based on internal and external requests. Actively contributing to statistical programming initiatives, you will support process improvements and innovation while providing expert advice, guidance, and training to trial and asset teams, fostering the development of your colleagues' skills.
Who are you: BSc or MSc in a numerate discipline, preferably Mathematics, Statistics, or Computer Science. Proven success in a Statistical Programming role within clinical development at a pharmaceutical or biotech company, or at a CRO, equivalent to a minimum of 9 years of directly relevant experience. Experience in an international environment is a plus. Advanced skills in R and SAS. Full familiarity with CDISC SDTM and ADaM standards (including specifications, Define.xml, and reviewer's guide) and underlying concepts. Strong understanding of processes related to clinical development programs. Experience in leading e-submission processes is beneficial. Demonstrated ability to manage assets effectively, ensuring timely delivery and quality outcomes. Ability to provide solutions for complex programming challenges and evaluate alternatives to identify optimal solutions.
Posted 13 hours ago
3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Building models using best-in-class ML technology. Training/fine-tuning models with new/modified training datasets. Selecting features, building and optimizing classifiers using machine learning techniques. Data mining using state-of-the-art methods. Processing, cleansing, and verifying the integrity of data used for analysis. Enhancing data collection procedures to include information that is relevant for building analytic systems.
Technical skills Mandatory: 3 to 8 years of experience as a Machine Learning Researcher or Data Scientist. Graduate in Engineering or Technology along with good business skills. Good applied statistics skills, such as distributions, statistical testing, regression, etc. Excellent understanding of machine learning techniques and algorithms, including knowledge about LLMs. Experience with NLP. Good scripting and programming skills in Python. Basic understanding of NoSQL databases, such as MongoDB, Cassandra.
Nice to have: Exposure to the financial research domain. Experience with JIRA, Confluence. Understanding of Scrum and Agile methodologies. Experience with data visualization tools, such as Grafana, GGplot, etc.
Soft skills: Oral and written communication skills. Good problem solving and negotiation skills. Passion, curiosity, and attention to detail.
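The classifier-building workflow above (select features, train, evaluate) can be sketched end to end on toy data. This is a deliberately simple nearest-centroid classifier in pure standard-library Python; in practice scikit-learn or similar would be used, and the synthetic 2-D data is purely illustrative.

```python
import random
from statistics import mean

# Synthetic two-class data: two well-separated Gaussian clusters.
random.seed(7)
data = [((random.gauss(0, 1), random.gauss(0, 1)), "a") for _ in range(50)] + \
       [((random.gauss(4, 1), random.gauss(4, 1)), "b") for _ in range(50)]
random.shuffle(data)
train, test = data[:80], data[80:]  # simple holdout split

def centroid(points):
    xs, ys = zip(*points)
    return (mean(xs), mean(ys))

# "Training" here is just computing one centroid per class.
centroids = {
    label: centroid([x for x, y in train if y == label])
    for label in {"a", "b"}
}

def predict(point):
    # Assign to the class whose centroid is nearest (squared distance).
    return min(centroids, key=lambda c: (point[0] - centroids[c][0]) ** 2
                                        + (point[1] - centroids[c][1]) ** 2)

accuracy = mean(1.0 if predict(x) == y else 0.0 for x, y in test)
```

The same train/holdout/evaluate loop carries over directly to real models; only the estimator and the feature pipeline change.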
Posted 13 hours ago
2.0 - 3.0 years
2 - 5 Lacs
Gurugram
Work from Office
Acuity is currently looking for dedicated and motivated individuals who have strong leadership, organizational, and teamwork skills for its Business Information Services (BIS) team based in Gurgaon.
Key Responsibilities Supporting the onshore bankers in meeting their financial and qualitative information requirements across multiple sectors such as TMT, Energy, Real Estate, Automotive, Consumer, Healthcare, and Banking for various companies and sectors across the globe. Tasks include: Information Retrieval: Provision of company information packs, comprising company filings, broker research, news runs, and other specified information. Company and Sector Analysis: Company research and industry-specific or macro-economic research. News Runs: Filtering of relevant news related to M&A, management, and material company announcements using Google or third-party paid sources. Market Analysis: Sourcing of market data such as share prices, currency, and ratios covering all asset types and products from multiple third-party data sources. Screening: Peer identification or M&A/DCM/ECM deal runs from market data sources. Other Research: Researching technical publications, regulatory frameworks, and data and analytical research. Other activities include structuring deliverables/teams and developing efficient processes. Managing a shift of junior research analysts and conducting quality control checks of the outgoing reports/packs of juniors. Demonstrate strength and experience in client/requester relationship building and management, and information/knowledge needs assessment.
Required Background MBA or PGDM in Finance or equivalent qualification. Minimum 2-4 years of relevant experience in the investment banking space. Should be comfortable working in rotational shifts. Expert knowledge of third-party sector and country-specific data sources such as Bloomberg, Thomson Eikon, Factiva, Capital IQ, MergerMarket, Euromonitor, etc. Understanding of financial concepts and awareness of different industries/sectors. Strong communication skills to engage with the client and upscale the library work.
Posted 13 hours ago
5.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
We are seeking a highly skilled and experienced HR Technology Subject Matter Expert (SME) to join our HR function. The ideal candidate will have hands-on experience working with HR systems in a global, matrixed environment and will be responsible for optimizing HR technology platforms to enhance process efficiency, user experience, and business outcomes.
KEY RESPONSIBILITIES: Serve as the go-to expert for HR technology platforms, providing deep functional and technical knowledge to HR stakeholders, IT teams, and external partners. Drive HR technology adoption. Provide consultative support to leverage technology, streamline HR processes, and improve employee experience. Manage day-to-day administration, configuration, and optimization of core HR systems and modules such as Core HR, Talent Acquisition, Performance, Learning, Compensation, etc. Support the end-to-end lifecycle for system changes, including requirement gathering, design, configuration, testing (SIT/UAT), training, deployment, and post-implementation support. Work closely with internal HR teams and external vendors during platform upgrades, patch management, and release cycles to evaluate and adopt new functionalities. Create and execute test scripts, coordinate UAT with HR teams, and document outcomes. Act as the primary liaison between HR, IT, and vendors for all HR technology-related matters, including incident resolution, SLA management, and enhancement discussions. Provide tier-2/3 support for system-related queries and resolve escalations in a timely manner.
KEY COMPETENCIES: Master's degree in Human Resources, Information Systems, Computer Science, or a related field. 5-7 years of experience in HR technology within a multinational corporate environment. Experience in implementing, configuring, or managing HR platforms such as Darwinbox (preferred), Oracle, DevOps, Pendo, and Jira, or similar. Hands-on experience in system integrations / enterprise transformation. Strong understanding of HR processes and practices including Core HR, Talent Management, Recruitment, Compensation, Learning, and Employee Experience.
Posted 13 hours ago
2.0 - 4.0 years
4 - 7 Lacs
Gurugram
Work from Office
Job Purpose We are seeking a Data Operations Engineer to improve the reliability and performance of the data pipeline. Successful candidates will work with researchers, data strategists, operations, and engineering teams to establish the smooth functioning of the pipeline, which is sourced from an enormous and continuously updating catalog of vendor and market data.
Essential Skills and Experience B.Tech/M.Tech/MCA with 1-3 years of overall experience. Proficient with the Python programming language as well as common query languages such as SQL. Experience with the Unix platform and toolset, such as bash, git, and regex. Excellent English communication, both oral and written. Experience in the financial services industry or data science is a plus. Critical thinking to dive into complex issues, identify root causes, and suggest or implement solutions. A positive, team-focused attitude and work ethic.
Key Responsibilities Support the daily operation and monitoring of the pipeline. Triage issues in a timely manner, monitoring real-time alerts while also servicing research-driven workflows. Improve the reliability and operability of pipeline components. Solve both business and technical problems around data structure, quality, and availability. Interact with external vendors on behalf of internal clients. Demonstrate high attention to detail, and work in a dynamic environment while maintaining high quality standards.
Key Metrics Core Python, Linux. Good to have: Perl or C++, PromQL. Behavioral Competencies Good communication (verbal and written). Experience in managing client stakeholders.
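The monitoring-and-triage duties above often reduce to scanning pipeline logs for failures and grouping them by component. A minimal sketch, assuming a made-up log format and hypothetical component names; real alerting would come from the pipeline's monitoring stack (e.g., PromQL queries) rather than ad-hoc parsing.

```python
import re
from collections import Counter

# Illustrative log lines; the format and component names are invented.
LOG_LINES = [
    "2024-05-01 02:00:01 INFO  loader      vendor_a feed ingested",
    "2024-05-01 02:00:05 ERROR loader      vendor_b feed timed out",
    "2024-05-01 02:01:10 ERROR validator   checksum mismatch for vendor_b",
    "2024-05-01 02:02:00 INFO  publisher   market data published",
]

# date, time, level, component, free-text message
LOG_RE = re.compile(r"^\S+ \S+ (?P<level>\w+)\s+(?P<component>\w+)\s+(?P<msg>.*)$")

def failures_by_component(lines):
    """Count ERROR-level lines per pipeline component for triage."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return counts

counts = failures_by_component(LOG_LINES)
```

A summary like this is enough to decide which component owner to page first when several alerts fire at once.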
Posted 13 hours ago
4.0 - 6.0 years
5 - 9 Lacs
Gurugram
Work from Office
Responsible for designing, building, and maintaining enterprise job schedules in BMC Control-M to support the Momentum platform. Works closely with application owners and infrastructure teams to ensure reliable, auditable, and optimized batch processing.
Desired Skills and Experience Essential skills 4+ years of hands-on experience with BMC Control-M. Strong knowledge of job flows, alerts, recovery, and batch design best practices. Experience supporting enterprise platforms (e.g., finance, regulatory, or ERP systems). Experience in SQL and PowerShell. Ability to work collaboratively in a team supporting mission-critical schedules. Boomi ETL is a plus. Education: B.E./B.Tech in Computer Science or related field.
Key Responsibilities Design and implement new Control-M job definitions for Momentum platform processes. Manage dependencies, conditions, and calendars to ensure reliable job execution. Monitor and troubleshoot job failures and coordinate resolutions with app and infra teams. Collaborate on runbook creation, SLA tracking, and job performance tuning. Participate in change control and production deployment cycles.
Key Metrics BMC Control-M, SQL, and PowerShell. Experience supporting enterprise platforms and MFT. Behavioral Competencies Good communication (verbal and written). Experience in managing client stakeholders.
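Managing dependencies and conditions, as this role requires, is at heart an ordering problem. Control-M expresses it through job conditions rather than code, but the idea can be sketched with a topological sort; the job names below are hypothetical.

```python
from graphlib import TopologicalSorter

# Each job maps to the set of jobs that must complete before it runs,
# analogous to Control-M "in" conditions. Names are illustrative only.
dependencies = {
    "load_trades":  [],
    "validate":     ["load_trades"],
    "post_gl":      ["validate"],
    "daily_report": ["validate", "post_gl"],
}

# static_order yields every job after all of its predecessors.
order = list(TopologicalSorter(dependencies).static_order())
```

A scheduler that respects this ordering (and Control-M does, via conditions) guarantees `daily_report` never starts before `validate` and `post_gl` have finished.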
Posted 13 hours ago
4.0 - 6.0 years
3 - 6 Lacs
Gurugram
Work from Office
Job Purpose Leads the one-time migration of 35 legacy Windows Task Scheduler jobs into Control-M as part of the Momentum modernization effort. Ensures reliability, standardization, and operational continuity.
Desired Skills and Experience Essential skills 4+ years of experience with BMC Control-M and workload automation. Experience migrating jobs from Windows Task Scheduler. Strong scripting (PowerShell and Batch) and dependency analysis skills. Experience in SQL. Detail-oriented, with strong documentation and validation discipline. Education: B.E./B.Tech in Computer Science or related field.
Key Responsibilities Analyze 35+ existing Windows Task Scheduler jobs (triggers, logic, scripts). Rebuild, test, and validate equivalent jobs in Control-M. Apply enterprise scheduling standards, including logging, alerts, and audit readiness. Coordinate testing, documentation, and final deployment with the Scheduling team. Support cutover and post-deployment monitoring until stable.
Key Metrics BMC Control-M, PowerShell, Batch scripting, Windows Task Scheduler, MFT, SQL. Behavioral Competencies Good communication (verbal and written). Experience in managing client stakeholders.
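The analysis step of such a migration often starts from Task Scheduler's XML exports (e.g., from `schtasks /query /xml`), pulling out the trigger and command each Control-M job definition will need. A rough sketch, with a heavily abbreviated sample export and invented script path:

```python
import xml.etree.ElementTree as ET

# Abbreviated Task Scheduler export; real exports carry many more elements.
TASK_XML = """<?xml version="1.0"?>
<Task xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <Triggers>
    <CalendarTrigger>
      <StartBoundary>2024-01-01T02:30:00</StartBoundary>
    </CalendarTrigger>
  </Triggers>
  <Actions>
    <Exec>
      <Command>powershell.exe</Command>
      <Arguments>-File C:\\jobs\\momentum_extract.ps1</Arguments>
    </Exec>
  </Actions>
</Task>"""

# Task Scheduler XML lives in its own namespace; map it once for queries.
NS = {"t": "http://schemas.microsoft.com/windows/2004/02/mit/task"}
root = ET.fromstring(TASK_XML)

# The fields a Control-M job definition would be built from.
job = {
    "start": root.findtext(".//t:StartBoundary", namespaces=NS),
    "command": root.findtext(".//t:Command", namespaces=NS),
    "arguments": root.findtext(".//t:Arguments", namespaces=NS),
}
```

Running this over all 35+ exports gives a migration inventory that the rebuilt Control-M jobs can be validated against.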
Posted 13 hours ago
2.0 - 3.0 years
1 - 5 Lacs
Gurugram
Work from Office
Managing data and pipeline status on CRM. Updating meeting notes and other deal-related data for the investment banking team. Take new initiatives and share ideas for reporting and managing data on CRM. Ability to think critically and analyze results. Assist in database management and coordination with various stakeholders. Excellent working knowledge of Deal Cloud CRM. Manage project timelines and quality of deliverables in a manner that ensures high client satisfaction. Demonstrate strength and flair in client/requester relationship building and management, and information/knowledge needs assessment. Conducting analysis and performing quality control checks of the outgoing reports/packs.
Required background Bachelor of Technology or Commerce or equivalent qualification. 2+ years of experience in CRM data management, Deal Cloud preferred. The candidate should have the ability to work independently. Strong communication skills to engage with the client and manage workflow. Good MS Office skills.
Posted 13 hours ago
7.0 years
0 Lacs
Greater Chennai Area
Remote
Genesys empowers organizations of all sizes to improve loyalty and business outcomes by creating the best experiences for their customers and employees. Through Genesys Cloud, the AI-powered Experience Orchestration platform, organizations can accelerate growth by delivering empathetic, personalized experiences at scale to drive customer loyalty, workforce engagement, efficiency and operational improvements. We employ more than 6,000 people across the globe who embrace empathy and cultivate collaboration to succeed. And, while we offer great benefits and perks like larger tech companies, our employees have the independence to make a larger impact on the company and take ownership of their work. Join the team and create the future of customer experience together. Job Summary The Genesys Data & Analytics Team The Data & Analytics team is a central team comprised of Data Engineering, Data Platform/Technologies, Data Analytics, Data Science, Data Product, and Data Governance practices. This mighty team serves the enterprise that includes sales, finance, marketing, customer success, product and more. The team serves as a core conduit and partner to operational systems that run the business including Salesforce, Workday and more. The IT Manager of Analytics plays a pivotal role within the Enterprise Data & Analytics organization at Genesys. This role is responsible for leading a team of analysts and driving delivery of impactful analytics solutions that support enterprise functions including sales, finance, marketing, customer success, and product teams. This leader will oversee day-to-day analytics operations, coach and mentor a team of analysts, and collaborate closely with stakeholders to ensure alignment of analytics deliverables with business goals. The ideal candidate brings hands-on analytics expertise, a passion for data storytelling, and a track record of managing successful analytics teams. 
This position offers flexible work arrangements and may be structured as either hybrid or fully remote. Responsibilities Lead and mentor a team of analytics professionals, fostering a collaborative and high-performing culture. Promote and drive best practices in analytics, data visualization, automation, governance, and documentation. Translate business needs into actionable data insights through dashboards, visualizations, and storytelling. Partner with enterprise functions to understand goals, define key metrics, and deliver analytics solutions that inform decision-making. Manage and prioritize the team's project backlog, ensuring timely and quality delivery of analytics products. Collaborate with data engineering and platform teams to ensure scalable and reliable data pipelines and sources. Contribute to the development and maintenance of a shared analytics framework and reusable assets. Advocate for self-service analytics and data literacy across the business. Ensure compliance with data privacy, governance, and security policies. Requirements 7+ years of relevant experience with a Bachelor's or Master's degree in a quantitative discipline (computer science, data science, math, statistics, physics, etc.). Proven ability to lead and inspire analytics teams, delivering results in a fast-paced, cross-functional environment. Strong proficiency in BI and visualization tools (e.g., Looker, Tableau, QuickSight, Power BI). Solid understanding of cloud data platforms and big data ecosystems (e.g., AWS, Snowflake, Databricks). Strong business acumen and the ability to communicate technical concepts clearly to non-technical stakeholders. Experience building and managing stakeholder relationships across multiple departments. Adept at SQL and data modeling principles. Experience with statistical scripting languages (Python preferred). Familiarity with Agile methodologies and project management tools (e.g., Jira, Confluence).
Demonstrates a results-oriented mindset, takes thoughtful risks, and approaches challenges with humility and a hands-on, resourceful attitude. Preferred Qualifications Creative, innovative, and solution-oriented design thinking: you evaluate things holistically and think through the objectives, impacts, best practices, and what will be simple and scalable. Excellent critical thinking, problem solving, and analytical skills with a keen attention to detail. Skilled at managing cross-functional relationships and communicating with leadership across multiple organizations. Strong team player: ability to lead peers in the accomplishment of common goals. If a Genesys employee referred you, please use the link they sent you to apply. About Genesys: Genesys empowers more than 8,000 organizations in over 100 countries to improve loyalty and business outcomes by creating the best experiences for their customers and employees. Through Genesys Cloud, the AI-powered Experience Orchestration platform, Genesys delivers the future of CX to organizations of all sizes so they can provide empathetic, personalized experiences at scale. As the trusted platform that is born in the cloud, Genesys Cloud helps organizations accelerate growth by enabling them to differentiate with the right customer experience at the right time, while driving stronger workforce engagement, efficiency, and operational improvements. Visit www.genesys.com. Reasonable Accommodations: If you require a reasonable accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you or someone you know may reach out to HR@genesys.com. You can expect a response from someone within 24-48 hours.
To ensure we set you up with the best reasonable accommodation, please provide them the following information: first and last name, country of residence, the job ID(s) or title(s) of the positions you would like to apply for, and the specific reasonable accommodation(s) or modification(s) you are requesting. This email is designed to assist job seekers who seek reasonable accommodation for the application process. Messages sent for non-accommodation-related issues, such as following up on an application or submitting a resume, may not receive a response. Genesys is an equal opportunity employer committed to fairness in the workplace. We evaluate qualified applicants without regard to race, color, age, religion, sex, sexual orientation, gender identity or expression, marital status, domestic partner status, national origin, genetics, disability, military and veteran status, and other protected characteristics. Please note that recruiters will never ask for sensitive personal or financial information during the application phase.
Posted 13 hours ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description Job Summary Full stack development experience; experience in Java, Angular, and SQL; familiarity with Angular 16+, containerization technologies, CI/CD, Git, and Maven. Experience with GCP and Azure DevOps preferred. Experience with Agile development. Good written and verbal communication skills. Ability to design solutions for complex problems. Must be a team player who shows initiative and is detail-oriented.
____________________________
This position provides input and support for full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She performs tasks within planned durations and established deadlines. This position collaborates with teams to ensure effective communication and support the achievement of objectives. He/She provides knowledge, development, maintenance, and support for applications.
Responsibilities Generates application documentation. Contributes to systems analysis and design. Designs and develops moderately complex applications. Contributes to integration builds. Contributes to maintenance and support. Monitors emerging technologies and products.
Qualifications Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field. Employee Type: Permanent. UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 13 hours ago
SQL (Structured Query Language) is a crucial skill in the field of data management and analysis. In India, the demand for professionals with SQL expertise is on the rise, with numerous job opportunities available across various industries. Job seekers looking to break into the IT sector or advance their careers in data-related roles can benefit greatly from acquiring SQL skills.
Cities known for their thriving IT industries are hotspots for SQL job openings.
In India, the average salary range for SQL professionals varies based on experience levels. Entry-level positions can expect to earn around ₹3-5 lakhs per annum, while experienced professionals with 5+ years of experience can earn anywhere from ₹8-15 lakhs per annum.
A typical career progression in the SQL domain may include roles such as:
- Junior SQL Developer
- SQL Developer
- Senior SQL Developer
- Database Administrator
- Data Analyst
- Data Scientist
Advancing to higher roles like Tech Lead or Data Architect is possible with increased experience and expertise.
In addition to SQL proficiency, job seekers in India may benefit from having skills such as:
- Data analysis and visualization tools (e.g., Tableau, Power BI)
- Programming languages (e.g., Python, R)
- Knowledge of database management systems (e.g., MySQL, Oracle)
- Understanding of data warehousing concepts
As you explore SQL job opportunities in India, remember to not only focus on mastering SQL but also to develop related skills that can make you a well-rounded professional in the data management field. Prepare thoroughly for interviews by practicing common SQL questions and showcase your expertise confidently. Good luck with your job search!