3.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
Location: Chennai
Notice Period: Immediate
Experience: 3 to 8 years
Mandatory Skills: AI, ML, Python, Data Scientist
Walk-in Drive: 18th Sep 2025 at Chennai (in person only)
Shift Timing: 11 AM – 8 PM (Monday to Friday)
Contact: 7976457434
Posted 3 days ago
6.0 - 11.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role
Data Engineer
Location: Bangalore
Experience: 6 to 12 Years
Choosing Capgemini means choosing a place where you'll be empowered to shape your career, supported by a collaborative global community, and inspired to reimagine what's possible. Join us in helping leading organizations drive scalable, sustainable growth.
Your Role: DevOps; IT operations; Java; Microsoft Azure; PySpark; Python; cloud computing; cloud providers; data analysis; data management; data processing; data science; information technology; multi-paradigm programming; programming.
Your Profile: Experience in public cloud; Python data science; software development; system administration; technology; Data Engineer.
What You'll Love About Working Here
We value flexibility and support our employees with remote work options and adaptable schedules to maintain a healthy work-life balance. Our inclusive culture brings together diverse professionals committed to growth, innovation, and excellence. You'll have access to continuous learning opportunities and certifications in emerging technologies like cloud and AI.
About Us
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transformation to address the evolving needs of customers and citizens. With a strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations. To achieve this, Capgemini draws on the capabilities of its 360,000 team members in more than 50 countries, all driven by the purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization with market-leading capabilities in digital, cloud, and data.
Posted 6 days ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Python developer, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
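The migration described above amounts to re-expressing SAS DATA steps and PROC SUMMARY-style jobs as Python pipelines. A minimal illustrative sketch (not Accenture's framework), assuming a hypothetical transactions.csv with status, amount, fee, account_id, and txn_date columns:

```python
import pandas as pd

# Hypothetical re-implementation of a simple SAS data-preparation step
# (filter, derived column, summary by group) as a pandas pipeline.
def prepare_transactions(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["txn_date"])

    # SAS: WHERE status = 'POSTED';
    df = df[df["status"] == "POSTED"]

    # SAS: net_amount = amount - fee;
    df["net_amount"] = df["amount"] - df["fee"]

    # SAS: PROC SUMMARY ... CLASS account_id; VAR net_amount;
    summary = (
        df.groupby("account_id", as_index=False)
          .agg(total_net=("net_amount", "sum"),
               txn_count=("net_amount", "size"))
    )
    return summary

if __name__ == "__main__":
    print(prepare_transactions("transactions.csv").head())
```

In a real migration the same logic would typically be wrapped in tested, parameterised pipeline steps rather than a single script.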
Posted 1 week ago
1.0 - 6.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Senior AI/ML Engineer at HireNema, you'll take ownership of end-to-end AI/ML solutions, from research and prototyping to large-scale deployment. You'll be designing algorithms that not only perform at scale but also meet the highest standards for accuracy, transparency, and fairness. If you live and breathe Python and know how to turn data into real-world impact, this role is for you.
Key Responsibilities
- Lead the design, development, and optimization of AI/ML models for candidate evaluation, scoring, and ranking
- Own the ML pipeline: data ingestion, preprocessing, feature engineering, model training, validation, deployment, and monitoring
- Build production-grade AI systems in Python with libraries like TensorFlow, PyTorch, Scikit-learn, and Hugging Face Transformers
- Apply best practices for model explainability, bias detection, and fairness in AI systems
- Collaborate with product managers and engineers to seamlessly integrate AI into the HireNema platform
- Mentor junior ML engineers and review their code for performance, scalability, and maintainability
- Stay ahead of AI/ML advancements and recommend practical adoption strategies
Required Skills & Qualifications
- Bachelor's, Master's, or PhD in Computer Science, Artificial Intelligence, Data Science, or related fields
- 2+ years of professional experience in AI/ML with hands-on Python programming expertise
- Strong knowledge of data structures, algorithms, and software engineering best practices
- Proven experience in building, training, and deploying ML models at scale (preferably in production environments)
- Expertise in at least one domain: NLP, recommendation systems, predictive analytics, or computer vision
- Proficiency with the Python data stack: Pandas, NumPy, Matplotlib/Seaborn for EDA
- Experience with cloud platforms (AWS, GCP, Azure) and MLOps tools (Docker, Kubernetes, MLflow, Airflow)
- Ability to translate business problems into well-defined ML solutions
Nice-to-Have Skills
- Experience in explainable AI and building trust in AI outputs
- Knowledge of large language models (LLMs) and prompt engineering
- Contributions to open-source AI/ML projects
- Experience in high-volume, high-performance systems
What We Offer
- Ownership of high-impact AI features used by real customers
- A collaborative, high-performance team culture
- Access to the latest AI/ML tools, frameworks, and cloud infrastructure
- A flexible work environment focused on results
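For context on the "own the ML pipeline" responsibility, here is a minimal scikit-learn sketch of a train/validate flow on synthetic data; the features and the binary "shortlist" label are invented for illustration and are not HireNema's actual model:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical toy data: a matrix of candidate signals and a binary label.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Preprocessing and model live in one pipeline so the same transforms run at
# training time and at inference time.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("test ROC AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```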
Posted 2 weeks ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Python developer, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
Posted 3 weeks ago
4.0 - 7.0 years
7 - 12 Lacs
Noida
Work from Office
Roles & Responsibilities
- IMATCH import/matching engine, ITRACS knowledge, case management, and ISUITE knowledge
- Cash management + SWIFT + stored procedures + Crystal Reports + batch processing + Control-M
- Full Stack Development experience
- Windows/Linux platform usage knowledge
- ETL and DB2 knowledge
Competencies: BFS: Cash Reconciliation, Asset Management. Most Critical: iMATCH
Mandatory Competencies
- Python - Python
- Database - MongoDB
- Database - SQL
- Beh - Communication
Posted 4 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Gurugram
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
Primary skills: Technology->Machine Learning->Python
Preferred Skills: Technology->Machine Learning->Python
Posted 4 weeks ago
8.0 - 12.0 years
30 - 35 Lacs
Chennai
Remote
Job Title: Sr. Python Data Engineer
Location: Chennai & Bangalore (Remote)
Job Type: Permanent Employee
Experience: 8 to 12 Years
Shift: 2 PM – 11 PM
Responsibilities
- Design and develop data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data needs.
- Maintain and optimize data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and implement data validation and cleansing routines.
- Work with large datasets from various sources.
- Automate repetitive data tasks and processes.
- Monitor data systems and troubleshoot issues as they arise.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or similar role (minimum 6+ years' experience as a Data Engineer).
- Strong proficiency in Python and PySpark.
- Excellent problem-solving abilities.
- Strong communication skills to collaborate with team members and stakeholders.
- Individual contributor role.
Technical Skills Required
- Expert: Python, PySpark, and SQL/Snowflake
- Advanced: Data warehousing, data pipeline design
- Advanced: Data quality, data validation, data cleansing
- Intermediate/Basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse Architecture
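As an illustration of the data validation and cleansing work listed above, here is a small PySpark sketch; the orders.parquet dataset, its columns, and the 5% threshold are assumptions for the example, not part of the posting:

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of a validation/cleansing step in PySpark, assuming a
# hypothetical orders.parquet dataset with order_id, amount and order_date.
spark = SparkSession.builder.appName("orders-cleansing").getOrCreate()

orders = spark.read.parquet("orders.parquet")

cleaned = (
    orders
    .dropDuplicates(["order_id"])                                   # de-duplicate on the business key
    .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))   # basic validity rule
    .withColumn("order_date", F.to_date("order_date"))              # normalise the date type
)

# Simple data-quality check: fail the job if too many rows were dropped.
total = orders.count()
dropped = total - cleaned.count()
if dropped > 0.05 * total:
    raise ValueError(f"Dropped {dropped} rows, above the 5% threshold")

cleaned.write.mode("overwrite").parquet("orders_clean.parquet")
```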
Posted 1 month ago
4.0 - 9.0 years
18 - 20 Lacs
Hyderabad
Hybrid
Position Job Title: Business Intelligence Developer
Reports To: Business Intelligence Manager
Primary Purpose
The BI Developer applies business and advanced technical expertise in meeting business data and reporting needs. The position supports business planning by compiling, visualizing, and analyzing business and statistical data from UCW's information systems. The BI Developer liaises with various stakeholders across the university to provide them with the data, reporting, and analysis required to make informed, data-driven decisions. The Business Intelligence Developer will work on projects that will have a significant impact on the student, faculty, and staff experience.
Specific Responsibilities
The BI Developer will at various times be responsible for the following, as well as other related duties as assigned to support the business objectives and purpose of the Company.
- Design relational databases to support business enterprise applications and physical data modeling according to business requirements
- Gather requirements from various business departments at UCW and transform them into self-serve reports/dashboards for the various business units using Power BI
- Understand ad-hoc data requirements and convert them into reporting deliverables
- Contribute to driving reporting automation and simplification to free up time for in-depth analyses
- Collaborate with internal and external team members, including system architects, software developers, database administrators, and design analysts, to find creative and innovative approaches to enrich business data
- Provide business and technical expertise for the analytics process, tools, and applications for the University
- Identify opportunities that improve data accuracy and efficiency of our processes
- Contribute to the development of training materials, documenting processes, and delivering sessions
- Develop strategies for data modeling, design, transport, and implementation to meet requirements for metadata management, operational data stores, and ELT/ETL environments
- Create and test data models for a variety of business data, applications, database structures, and metadata tables to meet operational goals for performance and efficiency
- Research modern technologies, data modeling methods, and information management systems and recommend changes to company data architectures
- Contribute to a team environment where all team members consistently experience a sense of belonging and inclusion
Position Requirements
Competencies:
- Demonstrated experience in creating complex data models and developing insightful reports and dashboards using Microsoft Power BI
- Must possess advanced skills in using DAX queries for Power BI
- Connecting data sources, importing data, cleaning, and transforming data for business intelligence
- Knowledge of database management principles and experience working with SQL/MySQL databases
- Ability to implement row-level security on data along with an understanding of application security layer models in Power BI
- Ability to translate business requirements into informative reports/visuals
- A good sense of design that will help communicate data using visually compelling reports and dashboards
- Experience in ETL (Extract, Transform and Load) processes an asset
- Experience in the development of a data warehouse an asset
- Data analysis and visualization skills using Python and/or R an asset
- Strong analytical, problem-solving, and data analysis skills
- Ability to ensure organizational data privacy and confidentiality
- Understanding of statistical analysis techniques such as correlation and regression
- Demonstrated ability to collect data from a variety of sources, synthesize data, produce reports, and make recommendations
- Ability to manage multiple concurrent tasks and competing demands
Education and Experience:
- Bachelor's or Master's degree in Business, Information Systems, Computer Science, or a related discipline
- Demonstrated experience in using Power BI to create reports, dashboards, and self-serve analytics
- Must have 3+ years of experience in data-specific roles, especially in the use of Power BI, Excel, and SQL
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Noida
Work from Office
Automation test engineer with experience using AWS and scripting in Python. Knowledge of the Boto3 framework is required. The test engineer should be able to test infrastructure provisioned using CDK (created and deleted), test the full pipeline, and write scripts to test the persona (role).
Experience Required: 5 - 8 Yrs
- Involves execution of testing, monitoring and operational activities of various complexity based on the assigned portfolio, ensuring adherence to established service levels and standards.
- Executes identified test programs for a variety of specializations to support effective testing & monitoring of controls within business groups and across the Bank.
- Understands the business/group strategy and develops and maintains knowledge of end-to-end processes.
- Executes testing activities and any other operational activities within required service level agreements or standards.
- Develops knowledge related to the program and/or area of specialty.
- Develops and maintains effective relationships with internal & external business partners/stakeholders to execute work and fulfill service delivery expectations.
- Participates in planning and implementation of operational programs and executes within required service level agreements and standards.
- Supports change management of varying scope and type; tasks typically focused on execution and sustainment activities.
- Executes various operational activities/requirements to ensure timely, accurate, and efficient service delivery.
- Ensures consistent, high quality practices/work and the achievement of business results in alignment with business/group strategies and with productivity goals.
- Analyzes automated test results and provides initial feedback on test results.
- Analyzes root causes of any errors discovered to provide for effective communication of issues to appropriate parties.
- Develops insights and recommends continuous improvements based on test results.
- Creates and maintains adequate monitoring support documentation, such as narratives, flowcharts, process flows, testing summaries, etc. to support the results of the reviews, including the write-up of findings/issues for reporting.
Mandatory Competencies
- QE - Test Automation Preparation
- Beh - Communication
- QA/QE - QA Automation - Python
- Data Science and Machine Learning - Data Science and Machine Learning - Python
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
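By way of illustration, the kind of Boto3-based check described above might look like the following pytest-style sketch; the stack and bucket names are hypothetical, not from the posting:

```python
import boto3

# Hedged sketch: pytest-style checks that a CDK-provisioned stack and one of
# its resources actually exist. Names below are assumptions for the example.
STACK_NAME = "data-pipeline-dev"        # assumed CDK stack name
BUCKET_NAME = "data-pipeline-dev-raw"   # assumed bucket created by the stack


def test_stack_is_deployed():
    cfn = boto3.client("cloudformation")
    stacks = cfn.describe_stacks(StackName=STACK_NAME)["Stacks"]
    assert stacks[0]["StackStatus"] in ("CREATE_COMPLETE", "UPDATE_COMPLETE")


def test_raw_bucket_exists():
    s3 = boto3.client("s3")
    # head_bucket raises a ClientError if the bucket is missing or inaccessible.
    s3.head_bucket(Bucket=BUCKET_NAME)
```

Run with pytest against credentials for the target account; a teardown-side test would assert the same resources are gone after the stack is deleted.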
Posted 1 month ago
4.0 - 7.0 years
7 - 12 Lacs
Noida
Work from Office
Roles & Responsibilities
- IMATCH import/matching engine, ITRACS knowledge, case management, and ISUITE knowledge
- Cash management + SWIFT + stored procedures + Crystal Reports + batch processing + Control-M
- Full Stack Development experience
- Windows/Linux platform usage knowledge
- ETL and DB2 knowledge
Competencies: BFS: Cash Reconciliation, Asset Management. Most Critical: iMATCH
Mandatory Competencies
- Python - Python
- Database - MongoDB
- Database - SQL
- Beh - Communication
Posted 2 months ago
8.0 - 10.0 years
13 - 18 Lacs
Chennai
Work from Office
Core Qualifications
- 12+ years in software/data architecture with hands-on experience.
- Agentic AI & AWS Bedrock (Must-Have): Demonstrated hands-on design, deployment, and operational experience with Agentic AI solutions leveraging AWS Bedrock and AWS Bedrock Agents.
- Deep expertise in cloud-native architectures on AWS (compute, storage, networking, security).
- Proven track record defining technology stacks across microservices, event streaming, and modern data platforms (e.g., Snowflake, Databricks).
- Proficiency with CI/CD and IaC (Azure DevOps, Terraform).
- Strong knowledge of data modeling, API design (REST/GraphQL), and integration patterns (ETL/ELT, CDC, messaging).
- Excellent communication and stakeholder-management skills; able to translate complex tech into business value.
Preferred
- Media or broadcasting industry experience.
- Familiarity with Salesforce or other enterprise iPaaS solutions.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.
Mandatory Skills: Generative AI.
Posted 2 months ago
6.0 - 9.0 years
0 - 2 Lacs
Bengaluru
Work from Office
Manager - Data Engineer: Elevate Your Impact Through Innovation and Learning
Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.
About Corporate and Investment Banking & Investment Research (CIB & IR)
As a global leader in knowledge processes, research, and analytics, you'll be working with a team that specializes in global market research, working with the top-rated investment research organizations, bulge bracket investment banks, and leading asset managers. We cater to 8 of the top 10 global banks, working alongside their product and sector teams, supporting them on deal origination, execution, valuation, and transaction advisory-related projects.
What you will be doing at Evalueserve
- Construct analytical dashboards from alternative data use cases, such as sector or thematic and financial KPI dashboards.
- Load and import data into internal warehouses through Azure Blob Storage and/or S3 deliveries, SFTP, and other ingestion mechanisms.
- Design and implement ETL workflows for preprocessing of transactional and aggregated datasets, including complex joins, window functions, aggregations, bins, and partitions (a small illustrative pandas sketch follows this posting).
- Manipulate and enhance time series datasets into relational data stores.
- Implement and refine panels in transactional datasets and relevant panel normalization.
- Conduct web scraping, extraction, and post-processing of numerical data from web-based datasets.
What we're looking for
- Previous experience working within fundamental equity investment workflows, such as exposure to financial modeling.
- High proficiency in SQL and the Python data stack (pandas, numpy, sklearn).
- Experience working with scheduling and execution platforms, such as Airflow, Prefect, or similar scheduled DAG frameworks.
- Understanding of efficient query management in Snowflake, Databricks, or equivalent platforms.
- Optional familiarity with automation of workflows that produce Excel outputs, such as through openpyxl.
- Optional familiarity with integrations and import/exports to REST/gRPC/GraphQL APIs.
Security: This role is performed in a dedicated, secure workspace.
Travel: Annual travel to the U.S. for onsite collaboration is expected.
Follow us on https://www.linkedin.com/company/evalueserve/
Learn more about what our leaders are saying about our achievements, including the AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024.
Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com
Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
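The pandas sketch referenced above: a toy version of a join, window function, aggregation, and binning step on invented transaction and panel tables (not Evalueserve data):

```python
import numpy as np
import pandas as pd

# Illustrative sketch only: the kind of join / window-function / aggregation /
# binning step the role describes, on small hypothetical transaction and panel tables.
rng = np.random.default_rng(0)
txns = pd.DataFrame({
    "panelist_id": rng.integers(1, 50, 1000),
    "merchant": rng.choice(["A", "B", "C"], 1000),
    "date": pd.to_datetime("2024-01-01") + pd.to_timedelta(rng.integers(0, 90, 1000), unit="D"),
    "amount": rng.gamma(2.0, 30.0, 1000).round(2),
})
panel = pd.DataFrame({"panelist_id": range(1, 50), "weight": rng.uniform(0.5, 2.0, 49)})

df = txns.merge(panel, on="panelist_id", how="inner")      # join to the panel
df["weighted_amount"] = df["amount"] * df["weight"]        # simple panel normalisation

# Window function: running spend per panelist, like SUM() OVER (PARTITION BY ... ORDER BY date).
df = df.sort_values(["panelist_id", "date"])
df["cum_spend"] = df.groupby("panelist_id")["weighted_amount"].cumsum()

# Binning and aggregation: bucket tickets by size, then roll up per merchant.
df["ticket_bin"] = pd.cut(df["amount"], bins=[0, 25, 100, np.inf], labels=["small", "medium", "large"])
print(df.groupby(["merchant", "ticket_bin"], observed=True)["weighted_amount"].sum())
```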
Posted 2 months ago
0.0 - 1.0 years
0 - 2 Lacs
Kochi
Work from Office
Responsibilities:
* Collaborate with cross-functional teams on projects
* Test and debug code
* Develop Python applications using the Flask framework
* Maintain code quality and documentation standards
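As a rough illustration of the Flask work mentioned above, a minimal application sketch; the /health route and port are placeholders, not part of the posting:

```python
from flask import Flask, jsonify

# Minimal Flask sketch of the kind of application work the role mentions.
app = Flask(__name__)

@app.get("/health")
def health():
    # A simple liveness endpoint; real routes would carry business logic.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(debug=True, port=5000)
```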
Posted 2 months ago
2.0 - 5.0 years
5 - 9 Lacs
Gurugram
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
Primary skills: Technology - Machine Learning - Python
Preferred Skills: Technology - Machine Learning - Python
Posted 2 months ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Python (Programming Language)
Good to have skills: Python on Azure
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Software Development Lead, you will engage in the development and configuration of software systems, either managing the entire process or focusing on specific stages of the product lifecycle. Your day will involve applying your extensive knowledge of various technologies, methodologies, and tools to support projects and clients effectively, ensuring that the software solutions meet the required standards and specifications. You will also be responsible for guiding your team through challenges and fostering an environment of collaboration and innovation.
Project Requirements:
+ Python data pipeline (ETL) development (MUST)
+ Hands-on experience writing tech designs (MUST)
+ Experience developing with streaming technologies - Kafka, EventHub, Event Grid (MUST)
+ Kubernetes deployments, DevOps knowledge (MUST)
+ Database technology - SQL Server, Cosmos DB (MUST)
+ Develop microservices/APIs in NestJS/Node.js
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Python on Azure.
- Strong understanding of software development methodologies.
- Experience with version control systems such as Git.
- Familiarity with Agile and Scrum methodologies.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
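To illustrate the streaming ETL requirement above, here is a hedged sketch using the kafka-python client; the broker address, topic, and event fields are assumptions for the example:

```python
import json
from kafka import KafkaConsumer  # kafka-python; broker and topic names below are assumptions

# Minimal sketch of a streaming ETL step: consume events, apply a small
# transformation, and hand them to a sink (printed here for illustration).
consumer = KafkaConsumer(
    "orders",                                  # hypothetical topic
    bootstrap_servers="localhost:9092",        # hypothetical broker
    group_id="etl-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def transform(event: dict) -> dict:
    # Example transformation: derive a total and normalise the currency code.
    event["total"] = event.get("quantity", 0) * event.get("unit_price", 0.0)
    event["currency"] = event.get("currency", "USD").upper()
    return event

# Runs until interrupted; a production pipeline would batch, handle errors,
# and write to a database or downstream event sink instead of printing.
for message in consumer:
    print(transform(message.value))
```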
Posted 2 months ago
12.0 - 22.0 years
20 - 32 Lacs
Pune, Chennai, Bengaluru
Work from Office
Location: Bengaluru, Chennai, Pune
Note: Immediate joiners only
Core Qualifications
- 12+ years in software & data architecture with hands-on delivery.
- Agentic AI & AWS Bedrock (Must-Have): Practical experience designing, deploying, and operating Agentic AI solutions using AWS Bedrock & Bedrock Agents.
- Cloud-Native AWS Expertise: Deep knowledge across compute, storage, networking, and security.
- Modern Architectures: Proven success in defining stacks for microservices, event-driven systems, and data platforms (e.g., Snowflake, Databricks).
- DevOps & IaC: Skilled in CI/CD pipelines and Infrastructure as Code using Azure DevOps & Terraform.
- Data & Integration: Strong in data modeling, REST/GraphQL API design, ETL/ELT, CDC, and messaging integration.
- Stakeholder Engagement: Excellent communicator with the ability to align tech solutions to business outcomes.
Preferred:
- Experience in media or broadcasting.
- Familiar with Salesforce or enterprise iPaaS platforms.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF
Have questions? I'm happy to help; connect with me on 9899080360 or email admin@spearheadps.com
Posted 2 months ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Python developer, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
Posted 2 months ago
8.0 - 13.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must have skills: SAS Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
Posted 2 months ago
8.0 - 13.0 years
14 - 19 Lacs
Coimbatore
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must have skills: SAS Base & Macros
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
Posted 2 months ago
8.0 - 13.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must have skills: SAS Base & Macros
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years full time education
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
- 3+ years of experience building software solutions using Python
- Strong fundamentals of Python, such as Python data layout, generators, decorators, file I/O, dynamic programming, algorithms, etc.
- Working knowledge of the Python standard library and libraries such as any ORM library, numpy, scipy, matplotlib, mlab, etc.
- Knowledge of fundamental design principles to build a scalable application
- Knowledge of Python web frameworks
- Working knowledge of core Java is an added plus
- Knowledge of web technologies (HTTP, JS) is an added plus
- A financial background is an added plus
- Any technical capabilities in the area of big data analytics are also an added plus
Salary Package: As per the industry standard
Preferred Programs: BE or BTech or equivalent degree with a strong Mathematics and Statistics foundation (for example, B.Sc. or M.Sc. in Mathematics & Computer Science)
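As a small illustration of two of the fundamentals listed above (decorators and generators), a self-contained sketch; example.txt is a hypothetical input file:

```python
import functools
import time

# A timing decorator and a generator that streams a file lazily.
def timed(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

def read_lines(path):
    """Generator: yields one stripped line at a time without loading the whole file."""
    with open(path) as fh:
        for line in fh:
            yield line.rstrip("\n")

@timed
def count_nonempty(path):
    return sum(1 for line in read_lines(path) if line)

if __name__ == "__main__":
    print(count_nonempty("example.txt"))  # 'example.txt' is a hypothetical input file
```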
Posted 2 months ago
6.0 - 8.0 years
15 - 20 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
- Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and the logging library
- Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML data structure
Required Candidate Profile
- Java Spring Boot API microservices: 8+ years of experience
- SQL: 5+ years of experience
- Azure: 3+ years of experience
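A brief sketch of the XML extraction skill described above, using the standard-library ElementTree; the <orders> document is made up for illustration:

```python
import xml.etree.ElementTree as ET

# Hedged sketch of pulling parent and child tag values out of an XML document.
SAMPLE = """
<orders>
  <order id="1001">
    <customer>Acme Ltd</customer>
    <total currency="INR">2500.00</total>
  </order>
  <order id="1002">
    <customer>Globex</customer>
    <total currency="INR">990.50</total>
  </order>
</orders>
"""

root = ET.fromstring(SAMPLE)
for order in root.findall("order"):          # parent elements
    order_id = order.get("id")               # attribute on the parent
    customer = order.findtext("customer")    # child element text
    total = float(order.findtext("total"))
    print(order_id, customer, total)
```

For very large or malformed documents, lxml (also named in the posting) offers the same API with faster parsing and recovery options.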
Posted 3 months ago
6.0 - 8.0 years
15 - 22 Lacs
Mumbai
Work from Office
- Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and the logging library
- Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML data structure
Required Candidate Profile
- Java Spring Boot API microservices: 8+ years of experience
- SQL: 5+ years of experience
- Azure: 3+ years of experience
Posted 3 months ago
6.0 - 10.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Team: Development - Alpha Data
Position Overview: We are seeking an experienced Python developer to join our Alpha Data team, responsible for delivering a vast quantity of data served to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject matter expert and developing strong working relationships with quant researchers, traders, and fellow colleagues across our Technology organisation. Alpha Data teams are able to deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while the data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation. We achieve an economy of scale through building new frameworks, libraries, and services used to increase the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.
- Take part ownership of our ever-growing estate of data pipelines,
- Propose and contribute to new abstractions and improvements - make a real positive impact across our team globally,
- Design, implement, test, optimize and troubleshoot our data pipelines, frameworks, and services,
- Collaborate with researchers to onboard new datasets,
- Regularly take the lead on production support operations - during normal working hours only.
Required Qualifications:
- 5+ years of experience coding to a high standard in Python, React, JavaScript,
- Bachelor's degree in a STEM subject,
- Experience with and knowledge of SQL, and one or more common RDBMS systems (we mostly use Postgres),
- Practical knowledge of commonly used protocols and tools used to transfer data (e.g. FTP, SFTP, HTTP APIs, AWS S3),
- Excellent communication skills.
Nice to haves
- Experience with big data frameworks, databases, distributed systems, or Cloud development.
- Experience with any of these: C++, kdb+/q, Rust.
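As a rough sketch of the kind of ingestion pipeline described above (pull a file over HTTP and land it in Postgres), under assumed URL, DSN, and table names:

```python
import io

import psycopg2
import requests

# Illustrative only: fetch a CSV export and bulk-load it into Postgres.
# The URL, DSN, table, and column names are all hypothetical.
API_URL = "https://example.com/exports/prices.csv"
PG_DSN = "dbname=alpha user=ingest host=localhost"

resp = requests.get(API_URL, timeout=30)
resp.raise_for_status()

conn = psycopg2.connect(PG_DSN)
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS prices_raw (
            symbol TEXT, trade_date DATE, close NUMERIC
        )
    """)
    # copy_expert streams the CSV body straight into the table.
    cur.copy_expert(
        "COPY prices_raw (symbol, trade_date, close) FROM STDIN WITH CSV HEADER",
        io.StringIO(resp.text),
    )
conn.close()
```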
Posted 3 months ago