3.0 - 5.0 years
8 - 12 Lacs
Gurugram
Work from Office
Position Summary
This requisition is for the Employee Referral Campaign. We are seeking high-energy, driven, and innovative Data Scientists to join our Data Science Practice to develop new, specialized capabilities for Axtria and to accelerate the company's growth by supporting our clients' commercial and clinical strategies.
Job Responsibilities
- Be an individual contributor on the Data Science team and solve real-world problems using cutting-edge capabilities and emerging technologies.
- Help clients translate the business use cases they are trying to crack into data science solutions.
- Provide genuine assistance to users by advising them on how to leverage Dataiku DSS to implement data science projects, from design to production.
- Handle data source configuration and maintenance; document and maintain work instructions.
- Deep working knowledge of machine learning frameworks such as TensorFlow, Caffe, Keras, and SparkML.
- Expert knowledge of statistical and probabilistic methods such as SVM, decision trees, and clustering.
- Expert knowledge of Python data science and math packages such as NumPy, Pandas, and scikit-learn.
- Proficiency in object-oriented languages (Java and/or Kotlin), Python, and common machine learning frameworks (TensorFlow, NLTK, Stanford NLP, LingPipe, etc.).
Education
Bachelor's equivalent - Engineering; Master's equivalent - Engineering.
Work Experience
Data Scientist
- 3-5 years of relevant experience in advanced statistical and mathematical models and predictive modeling using Python.
- Prior relevant experience in artificial intelligence and machine learning algorithms for developing scalable models, including supervised and unsupervised techniques such as NLP and deep learning algorithms.
- Ability to build scalable models using Python, RStudio, R Shiny, PySpark, Keras, and TensorFlow.
- Experience in delivering data science projects leveraging cloud infrastructure.
- Familiarity with cloud technology such as AWS/Azure and knowledge of AWS tools such as S3, EMR, EC2, Redshift, and Glue; visualization tools such as Tableau and Power BI.
- Relevant experience in feature engineering, feature selection, and model validation on big data.
- Knowledge of self-service analytics platforms such as Dataiku, KNIME, or Alteryx is an added advantage.
MLOps Engineering
- 3-5 years of experience with MLOps frameworks such as Kubeflow, MLflow, DataRobot, and Airflow; experience with Docker, Kubernetes, and OpenShift.
- Prior experience in end-to-end automated ecosystems including, but not limited to, building data pipelines, developing and deploying scalable models, orchestration, scheduling, automation, and ML operations.
- Ability to design and implement cloud solutions and to build MLOps pipelines on cloud platforms (AWS, MS Azure, or GCP).
- Programming languages such as Python, Go, Ruby, or Bash; a good understanding of Linux; knowledge of frameworks such as Keras, PyTorch, and TensorFlow.
- Ability to understand the tools used by data scientists, plus experience with software development and test automation.
- Good understanding of advanced AI/ML algorithms and their applications.
Gen AI
- 4-6 years of experience developing, testing, and deploying Python-based applications on Azure/AWS platforms.
- Basic knowledge of Generative AI, LLM, and GPT concepts.
- Deep understanding of architecture and work experience with web technologies.
- Hands-on experience with Python and SQL.
- Expertise in a popular Python web framework such as Flask or Django (a minimal illustration follows this posting).
- Familiarity with frontend technologies such as HTML, JavaScript, and React.
- Be an individual contributor on the Analytics and Development team and solve real-world problems using cutting-edge capabilities and emerging technologies based on LLM/GenAI/GPT.
- Able to interact with clients on GenAI-related capabilities and use cases.
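As a hedged illustration of the Python web-service work the Gen AI profile above describes, the sketch below shows a minimal Flask endpoint wrapping a placeholder text-processing step. The route name, payload fields, and truncation logic are assumptions for illustration, not details from the posting.

```python
# Minimal Flask service sketch; the endpoint and payload names are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/summarize", methods=["POST"])
def summarize():
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    # Placeholder for a call out to an LLM/GenAI backend; here we just truncate.
    summary = text[:200]
    return jsonify({"summary": summary})

if __name__ == "__main__":
    app.run(debug=True)
```

In practice, the placeholder step would be replaced by a call to whichever LLM service the project uses, with the endpoint deployed on the Azure/AWS infrastructure mentioned above.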
Posted 6 days ago
5.0 - 9.0 years
11 - 16 Lacs
Chennai
Work from Office
Job Summary
Synechron is seeking a detail-oriented and knowledgeable Senior QA Engineer specializing in Quality Assurance (QA) with a solid foundation in Business Analysis within Rates Derivatives. In this role, you will help ensure the quality and accuracy of derivative products, focusing on derivatives, fixed income products, and market data processes. Your expertise will support the organization's efforts to maintain high standards in trading and risk management systems, directly impacting operational efficiency and compliance.
Software Requirements
Required skills:
- Proficiency in MS Excel, including advanced functionality such as macros
- Strong working knowledge of MS SQL Server for data querying and management
- Competence in Python for automation, scripting, and data analysis
- Experience with automation testing tools and frameworks
- Basic understanding of version control tools (e.g., Git) is preferred
Preferred skills:
- Familiarity with business analysis tools and requirements-gathering platforms
- Exposure to cloud data environments or cloud-based testing tools
Overall Responsibilities
- Analyze and validate derivative trade data related to the Rates business, ensuring accuracy in P&L and risk calculations
- Develop and execute test cases, scripts, and automation processes to verify system integrity and data consistency (a simplified validation sketch follows this posting)
- Collaborate with quantitative analysts and business teams to understand trading models, market data, and risk methodologies
- Assist in analyzing and optimizing P&L and risk computation processes
- Document testing procedures, results, and process improvements to uphold quality standards
- Support the identification of system flaws or data discrepancies and recommend corrective actions
- Participate in requirement review sessions to ensure testing coverage aligns with business needs
Technical Skills (by Category)
- Programming languages (essential): Python (required for automation and data analysis); Excel macros (VBA) for automation and data manipulation
- Databases/data management (essential): MS SQL Server for data querying, validation, and management
- Cloud technologies: not essential for this role, but familiarity with cloud data environments (Azure, AWS) is a plus
- Frameworks and libraries: Python data libraries (e.g., pandas, NumPy) are preferred for automation tasks
- Development tools and methodologies: version control (Git) is preferred; test automation frameworks and scripting practices to ensure repeatability and accuracy
- Security protocols: not specifically applicable; adherence to data confidentiality and compliance standards is required
Experience Requirements
- Minimum of 6 years in Quality Assurance, Business Analysis, or related roles within the derivatives or financial markets industry
- Strong understanding of Rates Derivatives, fixed income products, and associated market data
- Hands-on experience in P&L and risk measurement calculations
- Proven history of developing and maintaining automation scripts, particularly in Python and Excel macros
- Experience working with SQL databases to extract, analyze, and validate data
- Industry experience in trading, risk, or quantitative teams preferred
Day-to-Day Activities
- Collaborate with quantitative analysts, business teams, and developers to understand trade data and risk models
- Develop, execute, and maintain automation scripts to streamline testing and validation processes
- Validate trade data accuracy and integrity across systems, focusing on P&L and risk calculations
- Perform detailed testing of business workflows, ensuring compliance with requirements and risk standards
- Analyze market data inputs and trade data discrepancies, reporting findings and potential improvements
- Prepare documentation of test cases, findings, and processes for audit and review purposes
- Participate in daily stand-ups, requirement discussions, and defect review meetings
- Provide ongoing feedback on system enhancements and automation opportunities
Qualifications
- Bachelor's degree in Finance, Economics, Computer Science, Information Technology, or a related field; equivalent professional experience acceptable
- Relevant certifications such as CFA, CQF, or ISTQB are preferred
- Demonstrated experience in QA, Business Analysis, or related roles within financial derivatives markets
- Commitment to continuous learning in financial products, quantitative methods, and automation techniques
Professional Competencies
- Strong analytical and critical thinking skills for understanding complex financial data and risk metrics
- Effective communicator capable of articulating technical and business issues clearly
- Collaborative team player able to work across business, technical, and QA teams
- Adaptability to evolving technology tools, processes, and regulatory requirements
- Focus on continuous improvement and process efficiency
- Good time management skills to prioritize tasks in a fast-paced environment
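A hypothetical sketch of the automated validation work referenced above: recomputing a simplified P&L and comparing it against system values with pandas and pytest. The linear P&L formula, column names, and sample figures are assumptions, not details from the posting; in a real suite the trade data would be pulled from MS SQL Server.

```python
# Simplified P&L reconciliation test; formula, columns, and values are illustrative.
import pandas as pd

def recompute_pnl(trades: pd.DataFrame) -> pd.Series:
    # Simplified linear P&L: (market price - trade price) * notional
    return (trades["market_price"] - trades["trade_price"]) * trades["notional"]

def test_pnl_matches_system_values():
    trades = pd.DataFrame({
        "trade_price": [101.25, 99.80],
        "market_price": [101.40, 99.50],
        "notional": [1_000_000, 2_000_000],
        "system_pnl": [150_000.0, -600_000.0],
    })
    recomputed = recompute_pnl(trades)
    # Tolerant comparison catches genuine discrepancies without flagging float noise.
    pd.testing.assert_series_equal(recomputed, trades["system_pnl"], check_names=False)
```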
Posted 3 weeks ago
5 - 10 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface integrate seamlessly with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework (a simplified migration sketch follows this posting).
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
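As a hedged sketch of the SAS-to-Python migration pattern described in the responsibilities above, the example below translates a small, invented SAS data step into a pandas function. The SAS step in the comment, the library/table names, and the margin rate are all illustrative assumptions.

```python
# Legacy SAS step being translated (illustrative):
#   data work.high_value;
#       set raw.trades;
#       where amount > 10000;
#       margin = amount * 0.02;
#   run;
import pandas as pd

def build_high_value(trades: pd.DataFrame) -> pd.DataFrame:
    """Pandas equivalent of the SAS data step shown above."""
    high_value = trades.loc[trades["amount"] > 10_000].copy()
    high_value["margin"] = high_value["amount"] * 0.02
    return high_value

if __name__ == "__main__":
    sample = pd.DataFrame({"trade_id": [1, 2], "amount": [5_000, 25_000]})
    print(build_high_value(sample))
```

In a real migration, each translated step would be wrapped in tests that compare its output against the legacy SAS dataset before cutover.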
Posted 1 month ago
3 - 8 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface integrate seamlessly with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 1 month ago
5 - 10 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must-have skills: SAS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 1 month ago
6 - 11 years
8 - 13 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Skills: Python programming/development, Python data structures, OOP concepts implementation with Python, RESTful APIs, API authentication / usage of tokens, Postman / cURL / Pytest / Unittest, MySQL/SQL (any RDBMS), Framework: Django or Flask
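As a hedged illustration of the token-based API work and Pytest/Unittest skills listed above, the sketch below calls a token-protected REST endpoint and unit-tests the client with a mocked response. The URL, token, and response shape are hypothetical placeholders.

```python
# Illustrative sketch: bearer-token API call plus a pytest-style unit test.
from unittest import mock

import requests

BASE_URL = "https://api.example.com"  # placeholder

def fetch_orders(token: str) -> list:
    resp = requests.get(
        f"{BASE_URL}/orders",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def test_fetch_orders_returns_list():
    fake_resp = mock.Mock(status_code=200)
    fake_resp.json.return_value = [{"id": 1}]
    fake_resp.raise_for_status.return_value = None
    with mock.patch("requests.get", return_value=fake_resp):
        assert fetch_orders("dummy-token") == [{"id": 1}]
```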
Posted 2 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Data Testing (SQL, ETL)
Responsibilities: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: 4-6 years of data testing (SQL, ETL) and hands-on Python automation; Power BI.
Preferred Skills: Technology -> Data Services Testing -> Data Warehouse Testing -> ETL tool
Additional Responsibilities: Must-have skills: SQL, Python data automation
Educational Requirements: Bachelor of Engineering
Service Line: Infosys Quality Engineering
* Location of posting is subject to business requirements
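A minimal sketch, not from the posting, of the kind of source-vs-target reconciliation check that SQL/ETL testing with Python automation typically involves. The connection URLs, table names, and the use of SQLAlchemy are assumptions for illustration.

```python
# Source-vs-target row-count reconciliation test; all identifiers are placeholders.
import sqlalchemy as sa

SOURCE_URL = "mssql+pyodbc://user:pass@source-dsn"     # placeholder
TARGET_URL = "mssql+pyodbc://user:pass@warehouse-dsn"  # placeholder

def row_count(engine, table: str) -> int:
    # Table names come from a fixed, trusted list in this sketch, not user input.
    with engine.connect() as conn:
        return conn.execute(sa.text(f"SELECT COUNT(*) FROM {table}")).scalar_one()

def test_orders_row_counts_match():
    source = sa.create_engine(SOURCE_URL)
    target = sa.create_engine(TARGET_URL)
    assert row_count(source, "dbo.orders") == row_count(target, "dbo.fact_orders")
```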
Posted 2 months ago
10 - 12 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Machine Learning
Good-to-have skills: No technology specialization
Minimum 10 year(s) of experience is required
Educational Qualification: Full-time, 15 years and above
Job Requirements
Key Responsibilities:
1. Creation, tracking, and reporting of project status by working closely with the client.
2. Use different statistical and modelling techniques to increase and optimize business outcomes.
3. Mine and analyze data using state-of-the-art methods; develop and implement custom data models and algorithms to apply to datasets.
4. Should perform, and know how to perform, platform engineering; familiar with computer vision and NLP.
Technical Experience: Minimum 6 years of Google BigQuery, Composer, Airflow, Vertex AI, Machine Learning, Deep Learning, and Python.
1. Strong and hands-on in SQL; strong understanding of Python data types and object-oriented programming (a short BigQuery data-access sketch follows this posting).
2. Strong in Google BigQuery, Composer, Airflow, and Vertex AI.
3. Understanding of machine learning algorithms: supervised, semi-supervised, and unsupervised.
4. Understanding of the different integration touch points in ML delivery.
5. Should anchor Pilot, MVP, and Production machine learning use cases.
Professional Attributes:
1. Good communication.
2. Good leadership and team-handling skills.
3. Analytical skills, presentation skills, and the ability to work under pressure.
4. Should be able to work in shifts whenever required.
Educational Qualification: Full-time, 15 years and above
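A hedged sketch of one small step in the Google BigQuery / Vertex AI stack listed above: pulling model training data from BigQuery into pandas. The project, dataset, table, and column names are assumptions.

```python
# Load training data from BigQuery into a pandas DataFrame; names are placeholders.
from google.cloud import bigquery

def load_training_frame(project: str = "my-project"):
    client = bigquery.Client(project=project)
    query = """
        SELECT feature_1, feature_2, label
        FROM `my-project.analytics.training_data`
        WHERE label IS NOT NULL
    """
    # to_dataframe() requires the BigQuery client's pandas extras to be installed.
    return client.query(query).to_dataframe()

if __name__ == "__main__":
    print(load_training_frame().head())
```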
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Pune
Work from Office
Python + SQL Developer (Python Data Integration Engineer)
Role and Responsibilities: As the Data Integration Engineer, you will play a pivotal role in shaping the future of our data integration engineering initiatives while remaining actively involved in the technical aspects of projects. Your responsibilities will include:
- Hands-On Contribution: Continue to be hands-on with data integration engineering tasks, including data pipeline development, EL processes, and data integration. Be the go-to expert for complex technical challenges.
- Integrations Architecture: Design and implement scalable and efficient data integration architectures that meet business requirements. Ensure data integrity, quality, scalability, and security throughout the pipeline.
- Tool Proficiency: Leverage your expertise in Snowflake, SQL, Apache Airflow, AWS, APIs, CI/CD, dbt, and Python to architect, develop, and optimize data solutions. Stay current with emerging technologies and industry best practices (a minimal pipeline sketch follows this posting).
- Data Quality: Monitor data quality and integrity, implementing data governance policies as needed.
- Cross-Functional Collaboration: Collaborate with data science, data warehousing, analytics, and other cross-functional teams to understand data requirements and deliver actionable insights.
- Performance Optimization: Identify and address performance bottlenecks within the data infrastructure. Optimize data pipelines for speed, reliability, and efficiency.
- Project Management: Oversee end-to-end project delivery, from requirements gathering to implementation. Ensure projects are delivered on time and within scope.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field; an advanced degree is a plus.
- 4 years of hands-on experience in Python programming.
- 3 years of experience in data engineering, with SQL experience.
Preferred Skills:
- Familiarity with cloud platforms such as AWS or Azure.
- Demonstrated experience in designing and developing RESTful APIs.
- Preferred experience with Snowflake, AWS, Kubernetes (EKS), CI/CD practices, Apache Airflow, and dbt.
- Good experience in full-stack development.
- Excellent analytical, problem-solving, and decision-making abilities.
- Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
- A collaborative mindset, with a focus on team success.
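A minimal Airflow DAG sketch of the kind of extract-and-load pipeline this role would build; the task logic, names, and schedule are illustrative only, and the syntax assumes Airflow 2.4 or later.

```python
# Two-step EL pipeline sketch; task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source API")

def load():
    print("load the extracted data into Snowflake")

with DAG(
    dag_id="example_el_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```

In practice the placeholder callables would be replaced by Snowflake/dbt operators or API clients, with connections managed through Airflow's connection store.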
Posted 2 months ago
5 - 10 years
15 - 25 Lacs
Chennai, Bengaluru, Bangalore Rural
Work from Office
5+ years of data engineering experience. Strong skills in Azure, Python, or Scala. Expertise in Apache Spark, Databricks, and SQL. Build scalable data pipelines and optimize workflows. Migrate Spark/Hive scripts to Databricks.
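A hedged sketch of the kind of PySpark transformation a Hive-to-Databricks migration produces; the table and column names are placeholders, not details from the posting.

```python
# Aggregate completed orders into a daily revenue table; identifiers are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hive_to_databricks_example").getOrCreate()

orders = spark.table("raw.orders")

daily_revenue = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").saveAsTable("curated.daily_revenue")
```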
Posted 3 months ago
6 - 10 years
15 - 22 Lacs
Bangarapet, Bengaluru, Bangalore Rural
Work from Office
We're hiring a Data Scientist for the Bangalore location.
Job Title: Data Scientist - Python & R Expertise
Experience: 6 to 10 Years
Location: Bangalore
Key Skills: Python, R, Machine Learning, Data Analysis
Responsibilities:
- Develop ML models using Python (scikit-learn, pandas, NumPy).
- Implement regression, classification, and time series models.
- Convert R code to Python and ensure smooth integration (a small conversion sketch follows this posting).
- Optimize R code for performance and scalability.
Requirements:
- 6+ years of IT experience (3+ years in Data Science).
- Strong expertise in Python and R (tidyverse, dplyr, data.table).
- Excellent problem-solving and communication skills.
More Information: +91 93289 09176 | yesha@tekpillar.com
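Illustrative only: translating a small dplyr aggregation into pandas, the kind of R-to-Python conversion this role calls for. The data frame and column names are assumptions.

```python
# Original R / dplyr (illustrative):
#   sales %>%
#     filter(region == "APAC") %>%
#     group_by(product) %>%
#     summarise(total = sum(revenue))
import pandas as pd

def summarise_apac(sales: pd.DataFrame) -> pd.DataFrame:
    """Pandas equivalent of the dplyr pipeline shown above."""
    return (
        sales.loc[sales["region"] == "APAC"]
        .groupby("product", as_index=False)["revenue"]
        .sum()
        .rename(columns={"revenue": "total"})
    )
```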
Posted 3 months ago
5 - 10 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BE
Summary: Stakeholder management; creation, tracking, and reporting of project status by working closely with the customer; use of different statistical and modelling techniques to increase and optimize business outcomes.
Roles & Responsibilities:
- Strong understanding of Python data types and object-oriented programming.
- Understanding of machine learning algorithms: supervised, semi-supervised, and unsupervised (a brief supervised-learning sketch follows this posting).
- Understanding of the different integration touch points in ML delivery.
- Should anchor Pilot/MVP/Production machine learning use case delivery for deployments on cloud as well as open source.
- Familiar with computer vision and NLP.
- Mine and analyze data using state-of-the-art methods.
- Develop and implement custom data models and algorithms to apply to datasets.
Professional & Technical Skills:
- Collaborative mindset with strong verbal and written communication skills.
- Experience working in Agile and DevOps processes.
- Open to learning newer technologies.
Additional Information: The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful AI-driven solutions.
Qualifications: BE
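A short, hedged sketch of a supervised learning workflow of the kind referenced above, using scikit-learn on synthetic data; nothing here comes from the posting itself.

```python
# Train and evaluate a classifier on a synthetic dataset standing in for project data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```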
Posted 3 months ago