Jobs
Interviews

24 Python Data Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Remote

Job Title: Sr. Python Data Engineer
Location: Chennai & Bangalore (Remote)
Job Type: Permanent Employee
Experience: 8 to 12 Years
Shift: 2 to 11 PM

Responsibilities
- Design and develop data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data needs.
- Maintain and optimize data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and implement data validation and cleansing routines.
- Work with large datasets from various sources.
- Automate repetitive data tasks and processes.
- Monitor data systems and troubleshoot issues as they arise.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role (minimum 6+ years of experience as a Data Engineer).
- Strong proficiency in Python and PySpark.
- Excellent problem-solving abilities.
- Strong communication skills to collaborate with team members and stakeholders.
- Individual contributor.

Technical Skills Required
- Expert: Python, PySpark, and SQL/Snowflake
- Advanced: Data warehousing, data pipeline design
- Advanced: Data quality, data validation, data cleansing
- Intermediate/Basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse Architecture
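The validation-and-cleansing routines this posting describes typically follow a reject-or-repair pattern per record. A minimal sketch in plain Python (field names such as customer_id, amount, and country are illustrative assumptions; a production pipeline would express the same logic in PySpark or SQL):

```python
def clean_rows(rows):
    """Validate and cleanse raw records, returning (clean, rejected)."""
    clean, rejected = [], []
    for row in rows:
        # Cleansing: strip stray whitespace from string fields.
        r = {k: (v.strip() if isinstance(v, str) else v) for k, v in row.items()}
        # Validation: required key must be present.
        if not r.get("customer_id"):
            rejected.append((r, "missing customer_id"))
            continue
        # Validation: amount must parse as a number.
        try:
            r["amount"] = float(r["amount"])
        except (TypeError, ValueError):
            rejected.append((r, "bad amount"))
            continue
        # Cleansing: normalise country codes, with a default for missing values.
        r["country"] = (r.get("country") or "UNKNOWN").upper()
        clean.append(r)
    return clean, rejected

raw = [
    {"customer_id": "c1", "amount": " 10.5 ", "country": "in"},
    {"customer_id": "", "amount": "3", "country": "us"},
    {"customer_id": "c2", "amount": "oops", "country": None},
]
clean, rejected = clean_rows(raw)
```

Keeping rejected rows (with a reason) rather than silently dropping them is what makes data quality auditable across the lifecycle.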

Posted 4 days ago

Apply

4.0 - 9.0 years

18 - 20 Lacs

Hyderabad

Hybrid

Position
Job Title: Business Intelligence Developer
Reports To: Business Intelligence Manager

Primary Purpose
The BI Developer applies business and advanced technical expertise to meet business data and reporting needs. The position supports business planning by compiling, visualizing, and analyzing business and statistical data from UCW's information systems. The BI Developer liaises with stakeholders across the university to provide them with the data, reporting, and analysis required to make informed, data-driven decisions, and will work on projects that have a significant impact on the student, faculty, and staff experience.

Specific Responsibilities
The BI Developer will at various times be responsible for the following, as well as other related duties as assigned to support the business objectives and purpose of the Company.
- Design relational databases to support business enterprise applications, and perform physical data modeling according to business requirements
- Gather requirements from business departments at UCW and transform them into self-serve reports/dashboards for the various business units using Power BI
- Understand ad-hoc data requirements and convert them into reporting deliverables
- Contribute to driving reporting automation and simplification to free up time for in-depth analyses
- Collaborate with internal and external team members, including system architects, software developers, database administrators, and design analysts, to find creative and innovative approaches to enrich business data
- Provide business and technical expertise for the analytics process, tools, and applications for the University
- Identify opportunities that improve data accuracy and the efficiency of our processes
- Contribute to the development of training materials, documenting processes, and delivering sessions
- Develop strategies for data modeling, design, transport, and implementation to meet requirements for metadata management, operational data stores, and ELT/ETL environments
- Create and test data models for a variety of business data, applications, database structures, and metadata tables to meet operational goals for performance and efficiency
- Research modern technologies, data modeling methods, and information management systems and recommend changes to company data architectures
- Contribute to a team environment where all team members consistently experience a sense of belonging and inclusion

Position Requirements
Competencies:
- Demonstrated experience in creating complex data models and developing insightful reports and dashboards using Microsoft Power BI
- Advanced skills in using DAX queries for Power BI
- Connecting data sources, importing data, and cleaning and transforming data for business intelligence
- Knowledge of database management principles and experience working with SQL/MySQL databases
- Ability to implement row-level security on data, along with an understanding of application security layer models in Power BI
- Ability to translate business requirements into informative reports/visuals
- A good sense of design that helps communicate data using visually compelling reports and dashboards
- Experience with ETL (Extract, Transform, and Load) processes an asset
- Experience being involved in the development of a data warehouse an asset
- Data analysis and visualization skills using Python and/or R an asset
- Strong analytical, problem-solving, and data analysis skills
- Ability to ensure organizational data privacy and confidentiality
- Understanding of statistical analysis techniques such as correlation and regression
- Demonstrated ability to collect data from a variety of sources, synthesize data, produce reports, and make recommendations
- Ability to manage multiple concurrent tasks and competing demands

Education and Experience:
- Bachelor's or Master's degree in Business, Information Systems, Computer Science, or a related discipline
- Demonstrated experience using Power BI to create reports, dashboards, and self-serve analytics
- 3+ years of experience in data-specific roles, especially using Power BI, Excel, and SQL
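The row-level security the posting mentions restricts which rows each audience can see. Power BI implements this with DAX role filters; the same concept can be sketched in plain SQL with a filtered view (sqlite standing in for the SQL/MySQL databases named above; table and column names are made up for illustration):

```python
import sqlite3

# In-memory database standing in for a reporting warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE enrolments (student TEXT, dept TEXT, fee REAL);
INSERT INTO enrolments VALUES
  ('A', 'business', 1200.0),
  ('B', 'cs',       1500.0),
  ('C', 'business',  900.0);
-- Row-level security idea: business analysts query this view,
-- never the base table, so they only ever see their own rows.
CREATE VIEW business_enrolments AS
  SELECT * FROM enrolments WHERE dept = 'business';
""")

rows = conn.execute(
    "SELECT student, fee FROM business_enrolments ORDER BY student"
).fetchall()
```

In Power BI the filter predicate would live in a security role (a DAX expression on the table) rather than a view, but the effect on the report consumer is the same.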

Posted 5 days ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Automation Test Engineer with experience using AWS and scripting in Python. Knowledge of the Boto3 framework is required. The test engineer should be able to test infrastructure provisioned using CDK (created and deleted), test the full pipeline, and write scripts to test the persona (role).

Experience Required: 5 - 8 Yrs

- Involves execution of testing, monitoring, and operational activities of various complexity based on the assigned portfolio, ensuring adherence to established service levels and standards.
- Executes identified test programs for a variety of specializations to support effective testing and monitoring of controls within business groups and across the Bank.
- Understands the business/group strategy and develops and maintains knowledge of end-to-end processes.
- Executes testing activities and any other operational activities within required service level agreements or standards.
- Develops knowledge related to the program and/or area of specialty.
- Develops and maintains effective relationships with internal and external business partners/stakeholders to execute work and fulfill service delivery expectations.
- Participates in planning and implementation of operational programs and executes within required service level agreements and standards.
- Supports change management of varying scope and type; tasks typically focused on execution and sustainment activities.
- Executes various operational activities/requirements to ensure timely, accurate, and efficient service delivery.
- Ensures consistent, high-quality work and the achievement of business results in alignment with business/group strategies and productivity goals.
- Analyzes automated test results and provides initial feedback on them.
- Analyzes root causes of any errors discovered to enable effective communication of issues to appropriate parties.
- Develops insights and recommends continuous improvements based on test results.
- Creates and maintains adequate monitoring support documentation (narratives, flowcharts, process flows, testing summaries, etc.) to support the results of reviews, including the write-up of findings/issues for reporting.

Mandatory Competencies
- QE - Test Automation Preparation
- Beh - Communication
- QA/QE - QA Automation - Python
- Data Science and Machine Learning - Python
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
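Testing CDK-provisioned infrastructure usually means asserting that the expected resources actually exist in the deployed stack. A hedged sketch of that check, with the CloudFormation client injected so it runs without AWS credentials (in real use you would pass boto3.client("cloudformation"); the stack name and logical IDs below are illustrative assumptions):

```python
def stack_has_resources(cf_client, stack_name, expected_logical_ids):
    """Return True if every expected logical ID exists in the stack."""
    resp = cf_client.list_stack_resources(StackName=stack_name)
    found = {r["LogicalResourceId"] for r in resp["StackResourceSummaries"]}
    return set(expected_logical_ids) <= found

class FakeCloudFormation:
    """Local stand-in mimicking the shape of the Boto3 CloudFormation
    client's list_stack_resources response, for testing without AWS."""
    def list_stack_resources(self, StackName):
        return {"StackResourceSummaries": [
            {"LogicalResourceId": "DataBucket"},
            {"LogicalResourceId": "IngestLambda"},
        ]}

ok = stack_has_resources(FakeCloudFormation(), "demo-stack", ["DataBucket"])
missing = stack_has_resources(FakeCloudFormation(), "demo-stack", ["Queue"])
```

The same dependency-injection shape lets one test suite cover both the "created" and "deleted" cases: after teardown, the real client raises or returns an empty summary list.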

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 12 Lacs

Noida

Work from Office

Roles & Responsibilities
- IMATCH import/matching engine, ITRACS knowledge, case management, and ISUITE knowledge
- Cash management + SWIFT + stored procedures + Crystal Reports + batch processing + Control-M
- Full-stack development experience
- Windows/Linux platform usage knowledge
- ETL and DB2 knowledge

Competencies: BFS: Cash Reconciliation, Asset Management. Most critical: IMATCH.

Mandatory Competencies
- Python - Python
- Database - MongoDB
- Database - SQL
- Beh - Communication

Posted 2 weeks ago

Apply

8.0 - 10.0 years

13 - 18 Lacs

Chennai

Work from Office

Core Qualifications
- 12+ years in software/data architecture with hands-on experience.
- Agentic AI & AWS Bedrock (Must-Have): Demonstrated hands-on design, deployment, and operational experience with Agentic AI solutions leveraging AWS Bedrock and AWS Bedrock Agents.
- Deep expertise in cloud-native architectures on AWS (compute, storage, networking, security).
- Proven track record defining technology stacks across microservices, event streaming, and modern data platforms (e.g., Snowflake, Databricks).
- Proficiency with CI/CD and IaC (Azure DevOps, Terraform).
- Strong knowledge of data modeling, API design (REST/GraphQL), and integration patterns (ETL/ELT, CDC, messaging).
- Excellent communication and stakeholder-management skills; able to translate complex technology into business value.

Preferred
- Media or broadcasting industry experience.
- Familiarity with Salesforce or other enterprise iPaaS solutions.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.

Mandatory Skills: Generative AI.

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 - 2 Lacs

Bengaluru

Work from Office

Manager - Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Corporate and Investment Banking & Investment Research (CIB & IR)
As a global leader in knowledge processes, research, and analytics, you'll be working with a team that specializes in global market research, working with top-rated investment research organizations, bulge-bracket investment banks, and leading asset managers. We cater to 8 of the top 10 global banks, working alongside their product and sector teams and supporting them on deal origination, execution, valuation, and transaction advisory-related projects.

What you will be doing at Evalueserve
- Construct analytical dashboards from alternative data use cases, such as sector/thematic and financial KPI dashboards.
- Load and import data into internal warehouses through Azure Blob Storage and/or S3 deliveries, SFTP, and other ingestion mechanisms.
- Design and implement ETL workflows for preprocessing of transactional and aggregated datasets, including complex joins, window functions, aggregations, bins, and partitions.
- Manipulate and enhance time series datasets and load them into relational data stores.
- Implement and refine panels in transactional datasets and the relevant panel normalization.
- Conduct web scraping, extraction, and post-processing of numerical data from web-based datasets.

What we're looking for
- Previous experience working within fundamental equity investment workflows, such as exposure to financial modeling.
- High proficiency in SQL and the Python data stack (pandas, numpy, sklearn).
- Experience working with scheduling and execution platforms, such as Airflow, Prefect, or similar scheduled DAG frameworks.
- Understanding of efficient query management in Snowflake, Databricks, or equivalent platforms.
- Optional familiarity with automation of workflows that produce Excel outputs, such as through openpyxl.
- Optional familiarity with integrations and imports/exports to REST/gRPC/GraphQL APIs.

Security: This role is performed in a dedicated, secure workspace.
Travel: Annual travel to the U.S. for onsite collaboration is expected.

Follow us on https://www.linkedin.com/company/evalueserve/ to learn more about our achievements: an AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve climbed 16 places on the 50 Best Firms for Data Scientists in 2024. Want to learn more about our culture and what it's like to work with us? Write to us at careers@evalueserve.com.

Disclaimer: This job description serves as an informative reference for the tasks you may be required to perform. It does not constitute an integral component of your employment agreement and is subject to periodic modification to align with evolving circumstances.

Please note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the background verification process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
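The ETL transforms this posting lists (window functions, aggregations, bins, and partitions) have direct pandas equivalents, since pandas is part of the required stack. A toy sketch (assuming pandas is installed; the merchant/amount columns are illustrative, not from the posting):

```python
import pandas as pd

# Toy transactional dataset.
tx = pd.DataFrame({
    "merchant": ["a", "a", "b", "b", "b"],
    "amount":   [10.0, 30.0, 5.0, 15.0, 40.0],
})

# Window function over a partition: running spend per merchant,
# analogous to SUM(...) OVER (PARTITION BY merchant ...) in SQL.
tx["running"] = tx.groupby("merchant")["amount"].cumsum()

# Plain aggregation per partition.
totals = tx.groupby("merchant")["amount"].sum()

# Binning: bucket transaction sizes into labelled ranges.
tx["size_bin"] = pd.cut(tx["amount"], bins=[0, 10, 20, 100],
                        labels=["small", "medium", "large"])
```

On warehouse-scale data the same expressions would typically run as SQL window functions in Snowflake or Databricks rather than in-memory pandas.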

Posted 3 weeks ago

Apply

0.0 - 1.0 years

0 - 2 Lacs

Kochi

Work from Office

Responsibilities: * Collaborate with cross-functional teams on projects * Test and debug code * Develop Python applications using Flask framework * Maintain code quality and documentation standards

Posted 3 weeks ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements
Primary skills: Technology - Machine Learning - Python
Preferred skills: Technology - Machine Learning - Python

Posted 3 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems, either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must-have skills: Python (Programming Language)
Good-to-have skills: Python on Azure
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Software Development Lead, you will engage in the development and configuration of software systems, either managing the entire process or focusing on specific stages of the product lifecycle. Your day will involve applying your extensive knowledge of various technologies, methodologies, and tools to support projects and clients effectively, ensuring that the software solutions meet the required standards and specifications. You will also be responsible for guiding your team through challenges and fostering an environment of collaboration and innovation.

Project Requirements:
- Python data pipeline (ETL) development (MUST)
- Hands-on experience writing tech designs (MUST)
- Experience developing with streaming technologies: Kafka, Event Hubs, Event Grid (MUST)
- Kubernetes deployments and DevOps knowledge (MUST)
- Database technology: SQL Server, Cosmos DB (MUST)
- Develop microservices/APIs in NestJS/Node.js

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language).
- Good-to-have skills: Experience with Python on Azure.
- Strong understanding of software development methodologies.
- Experience with version control systems such as Git.
- Familiarity with Agile and Scrum methodologies.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
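Streaming ETL stages like the Kafka/Event Hubs pipelines this role names usually consume events in micro-batches: drain what is available, transform, then load. A minimal sketch of that pattern using a stdlib queue as a stand-in for the consumer (a real pipeline would swap in a Kafka or Event Hubs client library; the event shape is an illustrative assumption):

```python
from queue import Queue, Empty

def drain_batch(q, max_batch):
    """Pull up to max_batch events from the source without blocking."""
    batch = []
    while len(batch) < max_batch:
        try:
            batch.append(q.get_nowait())
        except Empty:
            break  # source exhausted for now
    return batch

def transform(event):
    # Example ETL step: normalise each event before loading downstream.
    return {**event, "amount": round(event["amount"], 2)}

# Simulate a stream of incoming events.
source = Queue()
for amt in [1.2345, 2.5, 3.14159]:
    source.put({"amount": amt})

loaded = [transform(e) for e in drain_batch(source, max_batch=10)]
```

The drain/transform/load split mirrors how a Kubernetes-deployed consumer would be structured: each stage stays small, testable, and restartable.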

Posted 3 weeks ago

Apply

12.0 - 22.0 years

20 - 32 Lacs

Pune, Chennai, Bengaluru

Work from Office

Location: Bengaluru, Chennai, Pune
Note: Immediate joiners only

Core Qualifications
- 12+ years in software & data architecture with hands-on delivery.
- Agentic AI & AWS Bedrock (Must-Have): Practical experience designing, deploying, and operating Agentic AI solutions using AWS Bedrock & Bedrock Agents.
- Cloud-Native AWS Expertise: Deep knowledge across compute, storage, networking, and security.
- Modern Architectures: Proven success in defining stacks for microservices, event-driven systems, and data platforms (e.g., Snowflake, Databricks).
- DevOps & IaC: Skilled in CI/CD pipelines and Infrastructure as Code using Azure DevOps & Terraform.
- Data & Integration: Strong in data modeling, REST/GraphQL API design, ETL/ELT, CDC, and messaging integration.
- Stakeholder Engagement: Excellent communicator with the ability to align tech solutions to business outcomes.

Preferred
- Experience in media or broadcasting.
- Familiarity with Salesforce or enterprise iPaaS platforms.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF

Have questions? I'm happy to help; connect with me on 9899080360 or by email at admin@spearheadps.com.

Posted 4 weeks ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Python Developer, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud platforms as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full-time education
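The SAS-to-Python migration this role describes often boils down to re-expressing SAS data steps and procedures as pandas pipelines. A hedged sketch (assuming pandas is installed; the claims/region/paid columns are invented for illustration) of how a SAS summary like "PROC MEANS DATA=claims SUM; CLASS region; VAR paid; WHERE paid > 0;" might be translated:

```python
import pandas as pd

# Toy dataset standing in for a SAS input table.
claims = pd.DataFrame({
    "region": ["east", "east", "west", "west"],
    "paid":   [100.0, 0.0, 250.0, 50.0],
})

summary = (
    claims[claims["paid"] > 0]          # WHERE paid > 0
    .groupby("region", as_index=False)  # CLASS region
    .agg(total_paid=("paid", "sum"))    # SUM of VAR paid
)
```

Keeping the pandas steps aligned one-to-one with the original SAS clauses, as in the comments above, makes converted workflows much easier to validate against the legacy output.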

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision-making. Integrate security and data privacy protection.
Must-have skills: SAS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud platforms as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full-time education

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 19 Lacs

Coimbatore

Work from Office

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud platforms as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full-time education

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud platforms as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full-time education

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Mumbai

Work from Office

- 3+ years of experience building software solutions using Python
- Strong fundamentals of Python: data layout, generators, decorators, file I/O, dynamic programming, algorithms, etc.
- Working knowledge of the Python standard library and libraries such as an ORM library, numpy, scipy, matplotlib, mlab, etc.
- Knowledge of fundamental design principles for building a scalable application
- Knowledge of Python web frameworks
- Working knowledge of core Java is an added plus
- Knowledge of web technologies (HTTP, JS) is an added plus
- A financial background is an added plus
- Technical capabilities in the area of big data analytics are also an added plus

Salary Package: As per the industry standard
Preferred Programs: BE or BTech or an equivalent degree with a strong mathematics and statistics foundation (for example, B.Sc. or M.Sc. in Mathematics & Computer Science)
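Three of the fundamentals listed above (decorators, generators, and dynamic programming) compose naturally. A small self-contained illustration: a memoization decorator applied to a recursive Fibonacci, consumed lazily through a generator:

```python
import functools

def memoize(fn):
    """Decorator: cache results so repeated calls are O(1)."""
    cache = {}
    @functools.wraps(fn)
    def wrapper(n):
        if n not in cache:
            cache[n] = fn(n)
        return cache[n]
    return wrapper

@memoize
def fib(n):
    # Dynamic programming via memoised recursion: each subproblem
    # is solved once, turning exponential recursion into linear work.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def first_fibs(count):
    # Generator: lazily yield values instead of building a full list.
    for i in range(count):
        yield fib(i)

fibs = list(first_fibs(8))
```

Without the decorator, fib(35) would take seconds; with it, the call is effectively instant, which is the kind of trade-off interviewers probe when these fundamentals are listed.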

Posted 1 month ago

Apply

6.0 - 8.0 years

15 - 20 Lacs

Mumbai, Chennai, Bengaluru

Work from Office

Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and logging libraries. Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML data structure.

Required Candidate Profile: Java Spring Boot API microservices (8+ years of experience) + SQL (5+ years of experience) + Azure (3+ years of experience)
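The parent/child XML extraction skill named above can be shown with the stdlib ElementTree library the posting mentions. The payment-message shape below is an invented example, not from the posting:

```python
import xml.etree.ElementTree as ET

doc = """
<payments>
  <payment id="p1"><amount ccy="USD">100.50</amount></payment>
  <payment id="p2"><amount ccy="EUR">75.00</amount></payment>
</payments>
"""

root = ET.fromstring(doc)
records = [
    {
        "id": payment.get("id"),                   # attribute on parent tag
        "ccy": payment.find("amount").get("ccy"),  # attribute on child tag
        "amount": float(payment.find("amount").text),
    }
    for payment in root.findall("payment")
]
```

For very large or messy feeds, lxml (also listed) offers the same API with faster parsing and XPath support, so code like this ports over with little change.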

Posted 1 month ago

Apply

6.0 - 8.0 years

15 - 22 Lacs

Mumbai

Work from Office

Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and logging libraries. Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML data structure.

Required Candidate Profile: Java Spring Boot API microservices (8+ years of experience) + SQL (5+ years of experience) + Azure (3+ years of experience)

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 18 Lacs

Bengaluru

Work from Office

Team: Development - Alpha Data

Position Overview: We are seeking an experienced Python developer to join our Alpha Data team, responsible for delivering vast quantities of data to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject-matter expert and developing strong working relationships with quant researchers, traders, and colleagues across our Technology organisation. Alpha Data teams deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while the data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation. We achieve economies of scale by building new frameworks, libraries, and services that increase the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.

Responsibilities:
Take part ownership of our ever-growing estate of data pipelines.
Propose and contribute to new abstractions and improvements - make a real positive impact across our team globally.
Design, implement, test, optimize, and troubleshoot our data pipelines, frameworks, and services.
Collaborate with researchers to onboard new datasets.
Regularly take the lead on production support operations - during normal working hours only.

Required Qualifications:
5+ years of experience coding to a high standard in Python, React, and JavaScript.
Bachelor's degree in a STEM subject.
Experience with and knowledge of SQL and one or more common RDBMS systems (we mostly use Postgres).
Practical knowledge of commonly used protocols and tools for transferring data (e.g. FTP, SFTP, HTTP APIs, AWS S3).
Excellent communication skills.

Nice to have: Experience with big data frameworks, databases, distributed systems, or cloud development. Experience with any of: C++, kdb+/q, Rust.
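The ingestion-pipeline work described above can be sketched in miniature: fetch a feed, validate rows, and write them idempotently to an RDBMS so reruns are safe. This is an illustrative sketch only - the feed contents, table name, and schema are invented, and `sqlite3` stands in for the Postgres deployment the posting mentions.

```python
# Minimal ingestion-step sketch: parse a CSV feed, validate each row,
# and upsert into a database. sqlite3 stands in for Postgres here;
# the feed data and schema are made up for illustration.
import csv
import io
import sqlite3

RAW_FEED = """date,symbol,close
2024-01-02,ABC,101.5
2024-01-02,XYZ,55.2
2024-01-03,ABC,102.0
"""

def ingest(conn: sqlite3.Connection, feed: str) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prices ("
        "  date TEXT NOT NULL, symbol TEXT NOT NULL, close REAL NOT NULL,"
        "  PRIMARY KEY (date, symbol))"
    )
    loaded = 0
    for row in csv.DictReader(io.StringIO(feed)):
        # Basic validation: skip rows with missing or non-numeric prices.
        try:
            close = float(row["close"])
        except (KeyError, ValueError):
            continue
        # Idempotent upsert so re-running the pipeline is safe.
        conn.execute(
            "INSERT INTO prices (date, symbol, close) VALUES (?, ?, ?) "
            "ON CONFLICT(date, symbol) DO UPDATE SET close = excluded.close",
            (row["date"], row["symbol"], close),
        )
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
n = ingest(conn, RAW_FEED)
```

The upsert on the natural key is what makes the step resilient: a retried or replayed feed overwrites rather than duplicates, which is the property "highly maintainable" pipelines typically rely on.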

Posted 1 month ago

Apply

3.0 - 5.0 years

8 - 12 Lacs

Gurugram

Work from Office

Position Summary
This requisition is for the Employee Referral Campaign. We are seeking high-energy, driven, and innovative Data Scientists to join our Data Science Practice to develop new, specialized capabilities for Axtria and to accelerate the company's growth by supporting our clients' commercial and clinical strategies.

Job Responsibilities
Be an individual contributor on the Data Science team and solve real-world problems using cutting-edge capabilities and emerging technologies.
Help clients translate the business use cases they are trying to crack into data science solutions.
Provide genuine assistance to users by advising them on how to leverage Dataiku DSS to implement data science projects, from design to production.
Configure and maintain data sources; document and maintain work instructions.
Deep working knowledge of machine learning frameworks such as TensorFlow, Caffe, Keras, and SparkML.
Expert knowledge of statistical and probabilistic methods such as SVM, decision trees, and clustering.
Expert knowledge of Python data science and math packages such as NumPy, Pandas, and Sklearn.
Proficiency in object-oriented languages (Java and/or Kotlin) and Python, and common machine learning frameworks (TensorFlow, NLTK, Stanford NLP, LingPipe, etc.).

Education
Bachelor Equivalent - Engineering
Master's Equivalent - Engineering

Work Experience
Data Scientist: 3-5 years of relevant experience in advanced statistical and mathematical models and predictive modeling using Python. Prior relevant experience in the data science space, including artificial intelligence and machine learning algorithms for developing scalable models with supervised and unsupervised techniques like NLP and deep learning algorithms. Ability to build scalable models using Python, R Studio, R Shiny, PySpark, Keras, and TensorFlow. Experience delivering data science projects leveraging cloud infrastructure.
Familiarity with cloud technology such as AWS/Azure and knowledge of AWS tools such as S3, EMR, EC2, Redshift, and Glue; visualization tools like Tableau and Power BI. Relevant experience in feature engineering, feature selection, and model validation on big data. Knowledge of self-service analytics platforms such as Dataiku/KNIME/Alteryx is an added advantage.

ML Ops Engineering: 3-5 years of experience with MLOps frameworks like Kubeflow, MLflow, DataRobot, Airflow, etc., plus experience with Docker, Kubernetes, and OpenShift. Prior experience in end-to-end automated ecosystems including, but not limited to, building data pipelines, developing and deploying scalable models, orchestration, scheduling, automation, and ML operations. Ability to design and implement cloud solutions and to build MLOps pipelines on cloud platforms (AWS, MS Azure, or GCP). Programming languages like Python, Go, Ruby, or Bash; a good understanding of Linux; knowledge of frameworks such as Keras, PyTorch, TensorFlow, etc. Ability to understand tools used by data scientists, with experience in software development and test automation. Good understanding of advanced AI/ML algorithms and their applications.

Gen AI: Minimum of 4-6 years developing, testing, and deploying Python-based applications on Azure/AWS platforms. Must have basic knowledge of Generative AI / LLM / GPT concepts. Deep understanding of architecture and work experience with web technologies. Hands-on experience with Python and SQL. Expertise in a popular Python web framework, e.g. Flask or Django. Familiarity with frontend technologies like HTML, JavaScript, and React. Be an individual contributor in the Analytics and Development team and solve real-world problems using cutting-edge capabilities and emerging technologies based on LLM/GenAI/GPT. Able to interact with clients on GenAI-related capabilities and use cases.
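The unsupervised techniques the posting lists (e.g. clustering) can be illustrated with a toy k-means, sketched here in stdlib-only Python. This is a teaching sketch with invented data and naive initialization; a real project would use scikit-learn or SparkML as the posting suggests.

```python
# Toy k-means clustering sketch (stdlib only, illustrative data).
import math

def kmeans(points, k, iters=20):
    # Naive init: use the first k points as starting centroids (demo only).
    centroids = list(points[:k])
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated blobs around (0.1, 0.1) and (5.0, 5.0).
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
       (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(pts, k=2)
```

With two well-separated blobs, the algorithm converges in a couple of iterations to one centroid per blob, which is the behavior interviewers for roles like this often ask candidates to explain.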

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & responsibilities
Strong understanding of the Azure environment (PaaS, IaaS) and experience working with a hybrid model.
At least one project experience with the Azure data stack, involving components such as Azure Data Lake, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, Azure Analysis Services, and Azure SQL DWH.
Strong hands-on SQL/T-SQL/Spark SQL and database concepts.
Strong experience with Azure Blob Storage and ADLS Gen2.
Strong knowledge of Azure Key Vault, Managed Identity, and RBAC.
Strong experience and understanding of DAX and tabular models.
Experience in performance tuning, security, sizing, and deployment automation of SQL/Spark.
Good to have: knowledge of advanced analytics tools such as Azure Machine Learning, Event Hubs, and Azure Stream Analytics.
Good knowledge of data visualization tools (Power BI).
Able to do code reviews per the organization's best practices.
Exposure to/knowledge of NoSQL databases.
Good hands-on experience with Azure DevOps tools.
Experience with a multi-site project model; client communication skills.
Strong working experience ingesting data from various data sources and data types.
Good knowledge of Azure DevOps and understanding of build and release pipelines.
Good knowledge of push/pull requests in Azure Repos/Git repositories.
Good knowledge of code review and coding standards.
Good knowledge of unit and functional testing.
Expert knowledge of advanced calculations in MS Power BI Desktop (aggregate, date, logical, string, table).
Good at creating different visualizations using slicers, lines, pies, histograms, maps, scatter plots, bullets, heat maps, tree maps, etc.
Exceptional interpersonal and communication (verbal and written) skills.
Ability to manage mid-sized teams and customer interaction.
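The hands-on T-SQL/Spark SQL skill the role asks for often boils down to patterns like deduplicating a staging table with a window function, keeping only the latest record per business key. A hedged sketch follows, with invented table and column names, using `sqlite3` as a stand-in for Azure SQL DWH/Synapse (the SQL itself is the portable part):

```python
# Common ingestion pattern: keep the latest row per key via ROW_NUMBER().
# Table, columns, and data are invented; sqlite3 stands in for Synapse/SQL DWH.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging (customer_id INT, name TEXT, load_ts TEXT);
INSERT INTO staging VALUES
  (1, 'Asha old', '2024-01-01'),
  (1, 'Asha new', '2024-02-01'),
  (2, 'Ravi',     '2024-01-15');
""")

latest = conn.execute("""
SELECT customer_id, name FROM (
  SELECT customer_id, name,
         ROW_NUMBER() OVER (
           PARTITION BY customer_id ORDER BY load_ts DESC) AS rn
  FROM staging
) WHERE rn = 1
ORDER BY customer_id
""").fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` shape works essentially unchanged in T-SQL and Spark SQL, which is why it is a staple interview topic for this stack.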

Posted 1 month ago

Apply

5.0 - 9.0 years

11 - 16 Lacs

Chennai

Work from Office

Job Summary
Synechron is seeking a detail-oriented and knowledgeable Senior QA Engineer specializing in Quality Assurance (QA) with a solid foundation in Business Analysis within Rates Derivatives. In this role, you will help ensure the quality and accuracy of derivative products, focusing on derivatives, fixed income products, and market data processes. Your expertise will support the organization's efforts to maintain high standards in trading and risk management systems, directly impacting operational efficiency and compliance.

Software Requirements
Required Skills:
Proficiency in MS Excel, including advanced functionality such as Macros
Strong working knowledge of MS SQL Server for data querying and management
Competence in Python for automation, scripting, and data analysis
Experience with automation testing tools and frameworks
Basic understanding of version control tools (e.g., Git) is preferred
Preferred Skills:
Familiarity with business analysis tools and requirements-gathering platforms
Exposure to cloud data environments or cloud-based testing tools

Overall Responsibilities
Analyze and validate derivative trade data related to the Rates business, ensuring accuracy in P&L and risk calculations
Develop and execute test cases, scripts, and automation processes to verify system integrity and data consistency
Collaborate with Quantitative Analysts and Business Teams to understand trading models, market data, and risk methodologies
Assist in analyzing and optimizing P&L and risk computation processes
Document testing procedures, results, and process improvements to uphold quality standards
Support the identification of system flaws or data discrepancies and recommend corrective actions
Participate in requirement review sessions to ensure testing coverage aligns with business needs

Technical Skills (By Category)
Programming Languages (Essential): Python (required for automation and data analysis); Excel macros (VBA) for automation and data manipulation
Databases/Data Management (Essential): MS SQL Server (for data querying, validation, and management)
Cloud Technologies: Not essential for this role, but familiarity with cloud data environments (Azure, AWS) is a plus
Frameworks and Libraries: Python data libraries (e.g., pandas, NumPy) are preferred for automation tasks
Development Tools and Methodologies: Version control (Git) is preferred; test automation frameworks and scripting practices to ensure repeatability and accuracy
Security Protocols: Not specifically applicable; adherence to data confidentiality and compliance standards is required

Experience Requirements
Minimum of 6+ years in Quality Assurance, Business Analysis, or related roles within the derivatives or financial markets industry
Strong understanding of Rates Derivatives, fixed income products, and associated market data
Hands-on experience in P&L and risk measurement calculations
Proven history of developing and maintaining automation scripts, particularly in Python and Excel Macros
Experience working with SQL databases to extract, analyze, and validate data
Industry experience in trading, risk, or quantitative teams preferred

Day-to-Day Activities
Collaborate with Quantitative Analysts, Business Teams, and Developers to understand trade data and risk models
Develop, execute, and maintain automation scripts to streamline testing and validation processes
Validate trade data accuracy and integrity across systems, focusing on P&L and risk calculations
Perform detailed testing of business workflows, ensuring compliance with requirements and risk standards
Analyze market data inputs and trade data discrepancies, reporting findings and potential improvements
Prepare documentation of test cases, findings, and processes for audit and review purposes
Participate in daily stand-ups, requirement discussions, and defect review meetings
Provide ongoing feedback on system enhancements and automation opportunities

Qualifications
Bachelor's degree in Finance, Economics, Computer Science, Information Technology, or a related field; equivalent professional experience acceptable
Relevant certifications such as CFA, CQF, or ISTQB are preferred
Demonstrated experience in QA, Business Analysis, or related roles within financial derivatives markets
Commitment to continuous learning in financial products, quantitative methods, and automation techniques

Professional Competencies
Strong analytical and critical thinking skills, essential for understanding complex financial data and risk metrics
Effective communicator capable of articulating technical and business issues clearly
Collaborative team player able to work across business, technical, and QA teams
Adaptability to evolving technology tools, processes, and regulatory requirements
Focus on continuous improvement and process efficiency
Good time management skills to prioritize tasks in a fast-paced environment
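The "validate trade data accuracy, focusing on P&L calculations" responsibility above can be sketched as a small reconciliation check: recompute a simple P&L from trade attributes and flag rows where the system-reported figure disagrees. All trade data, the per-100 price convention, and the tolerance here are invented for illustration; real Rates P&L involves curve-based revaluation.

```python
# Illustrative P&L reconciliation check (made-up trades and convention).
trades = [
    # (trade_id, notional, entry_price, current_price, reported_pnl)
    ("T1", 1_000_000, 99.50, 100.25,  7_500.0),
    ("T2",   500_000, 101.00, 100.00, -5_000.0),
    ("T3", 2_000_000, 98.00, 98.10,   9_999.0),  # off by ~8000: should flag
]

def check_pnl(rows, tol=1.0):
    """Recompute P&L and return (trade_id, expected, reported) breaks."""
    breaks = []
    for trade_id, notional, entry, current, reported in rows:
        # Assumed convention: prices quoted per 100 of notional.
        expected = notional * (current - entry) / 100.0
        if abs(expected - reported) > tol:
            breaks.append((trade_id, expected, reported))
    return breaks

breaks = check_pnl(trades)
```

In practice such a routine would read both sides from MS SQL Server and write a break report, but the shape - recompute, compare within tolerance, report exceptions - is the core of the QA automation described.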

Posted 2 months ago

Apply

5 - 10 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must have skills: SAS Base & Macros
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
Must have skills: Proficiency in SAS Base & Macros
Strong understanding of statistical analysis and machine learning algorithms
Experience with data visualization tools such as Tableau or Power BI
Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years of experience in SAS or Python data engineering.
Qualification: 15 years full time education
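The core migration task in responsibilities 3-4 - translating a SAS data preparation job into Python - can be sketched as follows. The SAS DATA step shown in the comment is invented for illustration, and the stdlib translation below keeps it minimal; at scale the team would likely use pandas or a pipeline framework.

```python
# Hypothetical SAS DATA step being migrated (invented for illustration):
#
#   data out; set in;
#     where region = 'EU';
#     revenue_usd = revenue * fx_rate;
#   run;
#
# A direct stdlib Python re-engineering of the same filter + derived column:
rows = [
    {"region": "EU", "revenue": 100.0, "fx_rate": 1.1},
    {"region": "US", "revenue": 200.0, "fx_rate": 1.0},
    {"region": "EU", "revenue":  50.0, "fx_rate": 1.1},
]

def transform(records):
    out = []
    for r in records:
        if r["region"] != "EU":        # where region = 'EU'
            continue
        row = dict(r)                  # avoid mutating the input dataset
        row["revenue_usd"] = row["revenue"] * row["fx_rate"]
        out.append(row)
    return out

result = transform(rows)
```

Isolating each DATA step's filter and assignment logic into a pure function like this also makes responsibility 5 (validating converted workflows) straightforward to unit-test.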

Posted 2 months ago

Apply

3 - 8 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must have skills: SAS Base & Macros
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
Must have skills: Proficiency in SAS Base & Macros
Strong understanding of statistical analysis and machine learning algorithms
Experience with data visualization tools such as Tableau or Power BI
Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years of experience in SAS or Python data engineering.
Qualification: 15 years full time education

Posted 2 months ago

Apply

5 - 10 years

6 - 10 Lacs

Bengaluru

Work from Office

Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must have skills: SAS Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
Must have skills: Proficiency in SAS Base & Macros
Strong understanding of statistical analysis and machine learning algorithms
Experience with data visualization tools such as Tableau or Power BI
Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years of experience in SAS or Python data engineering.
Qualification: 15 years full time education
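Validating and testing converted workflows, as this SAS-to-Python migration role requires, usually means a parity check: run the legacy output and the migrated pipeline's output side by side and report any divergence beyond a tolerance. The key name, columns, and data below are illustrative only.

```python
# Parity check between legacy (SAS) output and migrated (Python) output.
# Key, columns, and sample rows are invented for illustration.
def parity_report(legacy, migrated, key="id", tol=1e-6):
    """Return (key, column) pairs where numeric values diverge beyond tol."""
    legacy_by_key = {r[key]: r for r in legacy}
    mismatches = []
    for row in migrated:
        old = legacy_by_key.get(row[key])
        if old is None:
            mismatches.append((row[key], "missing in legacy"))
            continue
        for col, val in row.items():
            if isinstance(val, float) and abs(val - old[col]) > tol:
                mismatches.append((row[key], col))
    return mismatches

legacy = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
migrated = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.5}]
mismatches = parity_report(legacy, migrated)
```

Running such a report over every migrated job before cutover is a common acceptance gate: an empty mismatch list is the sign-off criterion, and any break points directly at the SAS logic that was mistranslated.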

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies