7.0 - 12.0 years
0 - 2 Lacs
Bengaluru
Work from Office
Big Data - Senior Python + Spark Developer. Location: Bangalore. Experience: 7-12 years. Interview Process: L1: online interview with external panel (weekday/weekend); L2: face-to-face at Tech Mahindra. Job Responsibilities: Develop and maintain Python and Spark applications; write clean, efficient, and reusable Python code; work with Pandas and Polars for data processing; implement Spark Core, Spark SQL, and Spark Streaming; collaborate with teams and guide junior developers; handle performance tuning and troubleshooting. Skills Required: Strong Python and Spark programming skills; experience with Pandas and Polars; knowledge of the Hadoop ecosystem; good problem-solving and teamwork skills. Interested candidates can share their CV at kalyan.v@talent21.in
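As a rough illustration of the kind of Polars and Spark SQL work this role describes, the sketch below runs the same group-by aggregation once in Polars and once through Spark SQL. The dataset, column names, and app name are invented for the example (and `group_by` assumes a recent Polars release); none of this comes from the posting itself.

```python
# Minimal sketch, not part of the posting: the same aggregation expressed in
# Polars (single machine) and in Spark SQL (distributed). All data is made up.
import polars as pl
from pyspark.sql import SparkSession

# --- Polars: fast in-memory data processing ---
orders = pl.DataFrame({
    "city": ["Bengaluru", "Hyderabad", "Bengaluru"],
    "amount": [1200.0, 800.0, 450.0],
})
totals_pl = orders.group_by("city").agg(pl.col("amount").sum().alias("total"))
print(totals_pl)

# --- PySpark: the same logic on a distributed DataFrame via Spark SQL ---
spark = SparkSession.builder.appName("orders-demo").getOrCreate()
rows = [("Bengaluru", 1200.0), ("Hyderabad", 800.0), ("Bengaluru", 450.0)]
spark.createDataFrame(rows, ["city", "amount"]).createOrReplaceTempView("orders")
spark.sql("SELECT city, SUM(amount) AS total FROM orders GROUP BY city").show()
spark.stop()
```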
Posted 1 week ago
4.0 - 10.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Overview - About the job: Do you have hands-on experience with data engineering and data architecting? Are you familiar with metadata management and associated processes? We're looking for an expert communicator with a strong customer orientation and object-oriented programming experience to join our Corporate Technology and Security Engineering team as a Senior Data Engineer. In this role, you'll design, develop, and implement data models, ETL pipelines, and warehouses for our internal applications and systems. Additionally, you will provide architectural assessments, strategies, and roadmaps, and verify performance, fault tolerance, and security. If you're craving an exciting new opportunity where you can partner with project managers and other business leaders to facilitate projects that make good use of your data insights, let's chat! iCIMS is a high-growth Software-as-a-Service (SaaS) company headquartered in Holmdel, NJ. We are the industry's #1 recruitment software provider, delivering technology that supports approximately 4,000 contracted customers around the globe. Dedicated to maintaining an inclusive, inspirational, and innovative work environment, and committed to our consistent growth, we have a wide range of opportunities for career advancement within our organization. Come grow with us; apply today!
Responsibilities: Develops and delivers long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders. Creates short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. Establishes methods and procedures for tracking data quality, completeness, redundancy, and improvement. Creates strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. Designs, develops, and supports ETL pipelines. Oversees the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. Collaborates with project managers and business unit leaders for all projects involving CRM and downstream data. Addresses data-related problems with regard to systems integration, compatibility, and multiple-platform integration. Develops key components as needed to create testing criteria that guarantee the fidelity and performance of data analytics solutions. Implements data management processes, procedures, and decision support. Optimizes and monitors data pipelines feeding data stores or repositories. Works with data governance, customer success, and product reporting teams to build out advanced analytics and reporting dashboards leveraging tools such as Tableau and Kibana. Researches emerging trends and best-of-breed solutions for data modeling, data contextualization, and predictive analytics. Proficient understanding of distributed computing principles.
Qualifications: A minimum of 5 years of relevant experience. Hands-on knowledge of data modeling, data profiling, or data parsing. Experience in Azure Data Warehousing, Azure Data Factory, SSIS, SSAS, and ETL. Familiarity with metadata management and associated processes. Demonstrated expertise with repository creation and with data and information system life cycle methodologies. Experience with data processing flowcharting techniques. Ability to manage data and metadata migration. Programming experience with Python. Expertise in writing SQL and stored procedures. Experience with SFDC, NetSuite, Adaptive, and Concur APIs is highly desirable. Knowledge of AWS, GCP, Big Data, and Redshift is desirable. Experience with integration platforms such as Workato is a plus. Excellent client/user interaction skills to determine requirements. Strong customer orientation and success in creating a superior customer experience. Good knowledge of applicable data privacy practices and laws. Understanding of web services (SOAP, XML, UDDI, WSDL). Experience in defining, classifying, and maintaining MDM across an evolving set of SaaS interfaces.
Posted 1 week ago
6.0 - 11.0 years
25 - 37 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Azure Expertise: Proven experience with Azure Cloud services, especially Azure Data Factory, Azure SQL Database, and Azure Databricks. Expert in PySpark data processing and analytics. Strong background in building and optimizing data pipelines and workflows. Required Candidate Profile: Solid experience with data modeling, ETL processes, and data warehousing. Performance tuning: ability to optimize data pipelines and jobs to ensure scalability and performance, including troubleshooting and resolving performance issues.
Posted 1 week ago
3.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Job Title: Data Scientist. Location: Bangalore, India. Experience: 3+ years. Role & Responsibilities: We are looking for a highly motivated Data Scientist to join our analytics team in Bangalore. This role is ideal for individuals who possess strong machine learning and data engineering skills and are eager to solve complex problems in the financial technology space. You will work on developing predictive models, building scalable data solutions, and contributing to key decision-making processes. Design, build, and validate machine learning models including Logistic Regression, Random Forests, GBM, and Survival Models. Perform rigorous model evaluation using techniques such as K-Fold Cross Validation, Out-of-Time (OOT) validation, and X-Tab analysis. Utilize Python (via PyCharm/Jupyter), scikit-learn, PyTorch, or TensorFlow for end-to-end model development. Analyse large datasets using SQL and Excel to derive actionable insights. Develop and deploy APIs to serve machine learning models in production environments. Apply statistical techniques such as mean/variance analysis, probability distributions, and simulations to drive model accuracy and relevance. (Optional) Work with lending-specific metrics such as vintage curves, roll-forward tracking, and bounce rates to enhance financial models. Required Skills: Strong ML skills: logistic regression, random forests, GBM, survival models, K-fold validation, OOT, X-Tab, etc. Experience using scikit-learn, PyTorch, or TensorFlow. 3+ years of experience with Python (PyCharm/Jupyter), SQL, and Excel. Well versed in API development. Strong mathematical ability: mean, variance, probability distributions, simulation, etc. Understanding of lending metrics such as vintage curves, roll forwards, and bounces (good to have, not mandatory). Engineering background preferred; IIT/NIT is a plus, not a must. Prior experience in lending fintech preferred.
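Purely for illustration, the snippet below sketches one of the evaluation techniques named above (K-fold cross-validation) applied to a scikit-learn logistic regression on a synthetic dataset; the data, metric choice, and parameters are assumptions for the example, not details from the posting.

```python
# Minimal sketch: 5-fold cross-validation of a logistic regression with
# scikit-learn. The synthetic dataset stands in for real lending data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

X, y = make_classification(n_samples=1000, n_features=10, random_state=42)

model = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)

# One ROC-AUC score per fold; mean/std indicate how stable the model is.
scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print("AUC per fold:", np.round(scores, 3))
print(f"Mean AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```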
Posted 1 week ago
5.0 - 10.0 years
15 - 20 Lacs
Pune
Work from Office
AZURE DATA ENGINEER. Skills: strong technical experience in Azure, SQL, Azure Data Factory, ETL, and Databricks. Graduation is a must. Experience: 5-10 years. CTC: up to 14-20 LPA. 21st June: F2F interview only (Pune). Contact: 7742324144
Posted 1 week ago
9.0 - 14.0 years
10 - 20 Lacs
Bengaluru
Hybrid
9+ years of experience across scrum master, program manager, product owner/manager, business analyst, and similar positions in eCommerce. Experience in driving joint team roadmap development and refinement resulting in plans that have been delivered via agile methodologies. Must have prior experience managing Data Engineering & Analytics teams. Prior experience driving and delivering ETL programs with the Microsoft Azure, Snowflake, and Power BI technology stack. Demonstrated ability to lead large-scale projects end-to-end, from planning to execution, while managing risk, scope, timelines, and resources. Proficiency in both Agile and Waterfall methodologies is expected. Knowledge of ETL processes, tools, and technologies, with hands-on experience in managing complex data integration projects. Experience working in agile development, including tools like JIRA, Confluence, and others. Strong communication skills, team management, and the ability to build relationships across cross-functional teams. Digital expertise and experience: develop and manage the technology cycle of strategic planning and delivery of customer value. Nice to have: first-level Scrum training (Certified Scrum Master) or SAFe Agile. Demonstrated experience of instilling a continuous-improvement culture.
Posted 1 week ago
3.0 - 5.0 years
6 - 16 Lacs
Hyderabad
Remote
Job Title: Data Engineer. Level/Band: 4+ yrs. Preferred Shift: Remote (night shift, PST time zone). Notice Period: 1 month. Mandatory Skills: Advanced proficiency with Azure Data Factory, Power BI, and other Azure data tools. Strong command of Excel for data analysis, including PivotTables, charts, and integration with Power BI datasets. Expertise in data scoring techniques and a demonstrated ability to learn and apply U-SQL. Proven ability to independently pull, analyze, and interpret large datasets to drive business decisions. Excellent data visualization skills and experience plotting data for diverse audiences. Strong communication skills for effective collaboration across technical and non-technical teams. Demonstrated eagerness to learn new technologies and adapt to evolving data landscapes. Ability to work autonomously, manage multiple priorities, and deliver high-quality results in a fast-paced environment. Roles & Responsibilities: Build, manage, and maintain robust data pipelines for integration, transformation, and delivery of data using Azure Data Factory (ADF) and related Azure data services. Oversee real-time monitoring, logging, and alerting for data pipelines to ensure data quality, reliability, and timely issue resolution. Develop, automate, and maintain business reporting and alerting systems using Power BI and other visualization tools. Independently extract, manipulate, and analyze data from various sources to generate insights and support ad hoc business requests, leveraging Excel, Power BI, and other analytical tools. Demonstrate advanced skills in data scoring, U-SQL, and the ability to quickly grasp new data query languages and technologies. Visualize and plot data effectively to communicate findings and trends to stakeholders. Continuously learn and adopt new data technologies, tools, and best practices to improve data processes and capabilities. Collaborate with cross-functional teams to understand data requirements, share insights, and support domain-specific needs. Ensure data security, compliance, and adherence to organizational standards throughout all data processes.
Posted 1 week ago
15.0 - 20.0 years
17 - 22 Lacs
Hyderabad
Work from Office
About the Role: Director, Data Engineering Manager. S&P Global Ratings is seeking an experienced leader to head our data engineering teams within the Data Services group, a collaborative team of data and technology professionals dedicated to shaping and executing the strategic data roadmap for S&P Global Ratings. This position is based out of Hyderabad, India. The successful candidate will play a key role in designing and building our data engineering platforms, contributing to the design and deployment of advanced engineering and machine learning solutions. We look forward to welcoming a leader who can drive innovation and excellence within our teams.
The Team: Join the Rating Organization's Data Services Product Engineering Team, known for its expertise in critical data domains and technology stacks. This team values knowledge sharing, collaboration, and a unified strategy to build S&P Ratings' next-gen analytics platform. Members provide leadership, innovation, and articulate business value, contributing to a unique opportunity to evolve the platform.
Responsibilities and Impact: Lead and manage multiple engineering teams across different time zones, ensuring effective collaboration and communication. Provide technical guidance and mentorship in data engineering and microservice architecture. Drive the development and implementation of best practices in software engineering. Collaborate with cross-functional teams to align engineering efforts with business goals and objectives. Cultivate a positive team environment that encourages innovation, creativity, and open communication. Influence stakeholders and team members to embrace new technologies and methodologies. Monitor team performance, providing constructive feedback and coaching. Stay abreast of industry trends and emerging technologies to inform strategic decisions. Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes. Produce comprehensive technical design documents and conduct technical walkthroughs. Enhance team productivity and efficiency through effective leadership and mentorship. Drive innovation in engineering practices that align with business objectives. Ensure high-quality deliverables that meet or exceed stakeholder expectations. Strengthen collaboration across global teams, improving overall project outcomes. Foster a culture of continuous learning and improvement across engineering teams.
What We're Looking For: Basic Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or Engineering is required. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. 15+ years of experience, with 8+ years designing enterprise products, modern data stacks, and analytics platforms. 8+ years of hands-on experience contributing to application architecture and designs, proven software/enterprise integration design patterns, and full-stack knowledge including modern distributed front-end and back-end technology stacks. 8+ years of full-stack development experience in modern web development technologies: Java/J2EE, UI frameworks like Angular and React, data lake systems like Databricks using AWS cloud technologies, and PySpark, SQL, Oracle, and NoSQL databases like MongoDB. Thorough understanding of distributed computing. Experience designing transactional/data warehouse/data lake systems and data integrations with the big data ecosystem leveraging AWS cloud technologies. Passionate, smart, and articulate developer. Experience with frameworks such as Angular, React JS, Durandal.js, Knockout.js, and Bootstrap.js. Quality-first mindset with a strong background and experience in developing products for a global audience at scale. Excellent analytical thinking, interpersonal, oral, and written communication skills, with a strong ability to influence both IT and business partners. Superior knowledge of system architecture, object-oriented design, and design patterns. Good work ethic, self-starter, and results-oriented. Excellent communication skills are essential, with strong verbal and writing proficiencies.
Additional Preferred Qualifications: Experience working with AWS. Experience with the SAFe Agile framework. Bachelor's/PG degree in Computer Science, Information Systems, or equivalent. Hands-on experience contributing to application architecture and designs, with proven software/enterprise integration design principles. Ability to prioritize and manage work to critical project timelines in a fast-paced environment. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Ability to train and mentor.
About S&P Global Ratings: At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction. By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit
What's In It For You: Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Posted 1 week ago
10.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, Information Technology, or a related field. Familiarity with cloud databases and full-stack technologies (such as .NET and C#). 10-12 years of extensive experience in SQL Server DBA work, data modeling, management, and integration services. Data migration, complex SPROCs and performant database functions, ETL (Extract, Transform, Load), and integration services, including SSIS. Data analysis, data quality, and proficient data modeling. Test automation tools and techniques for SQL databases. Demonstrated experience in applying DevSecOps best practices within an Agile software development environment. Collaborate effectively with cross-functional teams in an Agile environment, utilizing tools like Jira, Confluence, and Gliffy for documentation and workflows. Design logical and physical data models to support application and reporting needs. Collaborate with business analysts and developers to understand data requirements. Create and maintain conceptual, logical, and physical data models. Evaluate and recommend improvements to data architecture and flows. Monitor database performance and implement tuning measures for optimization. Expertise in database platforms such as Microsoft SQL Server, Oracle DB, PostgreSQL, MySQL, and cloud databases.
Posted 1 week ago
5.0 - 9.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Minimum Education and Experience Required: A Bachelor's degree in Technology/Engineering, a Master's in Computer Applications (MCA), or a Master of Science in Computer Science from a reputed institute; a Master's in Technology/Engineering is preferable. The following skill set is required for the Emerging Technologies team lead. Technical Skills: Programming skills: proficient in Python with libraries like TensorFlow, PyTorch, and Scikit-learn. Understanding of machine learning algorithms and neural networks. Working with computer vision models and large language models (NLP). Data Engineering: knowledge of data cleaning, pre-processing, feature engineering, and data pipelines. Cloud Computing: familiarity with cloud platforms like AWS, Azure, or GCP for deploying AI models. Leadership, Problem-Solving & Learning Skills: Good communication and listening skills. Team building, defining project goals, prioritizing tasks, guiding team members, and sharing knowledge and best practices. Identifying potential issues in AI models, evaluating data quality, and troubleshooting technical challenges. Keeping pace with rapid advancements in the AI field, actively researching new techniques and tools. Roles and Responsibilities: About the Role: The Emerging Technologies Team Lead will be responsible for driving innovation and implementation of cutting-edge technologies within the organization. This role requires strategic thinking and the ability to translate emerging technology trends into impactful solutions that enhance governance. The leader will oversee projects, manage resources, and ensure alignment with the organization's goals. About the Team: The team consists of a diverse group of professionals with expertise in various emerging technologies, including AI, blockchain, and data analytics. Collaboration and knowledge sharing are core principles within the team, promoting a culture of continuous learning and improvement. The team is dedicated to researching and deploying innovative solutions that can improve public governance and service delivery. You are responsible for: Leading the development and execution of technology projects from conception to deployment. Building and maintaining relationships with stakeholders to identify technology needs and opportunities. Mentoring and guiding team members in their professional growth and technical skills. Staying updated on the latest trends in technology and assessing their potential impact on governance strategies. To succeed in this role, you should have the following: Proven experience in leading technology teams and managing complex projects. Strong understanding of emerging technologies and their applications in governance. Excellent communication and interpersonal skills to collaborate effectively with various stakeholders. A forward-thinking mindset and the ability to adapt to changes in technology and governance landscapes.
Posted 1 week ago
7.0 - 12.0 years
13 - 17 Lacs
Noida
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Work with large, diverse datasets to deliver predictive and prescriptive analytics. Develop innovative solutions using data modeling, machine learning, and statistical analysis. Design, build, and evaluate predictive and prescriptive models and algorithms. Use tools like SQL, Python, R, and Hadoop for data analysis and interpretation. Solve complex problems using data-driven approaches. Collaborate with cross-functional teams to align data science solutions with business goals. Lead AI/ML project execution to deliver measurable business value. Ensure data governance and maintain reusable platforms and tools. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Technical Skills: Programming Languages: Python, R, SQL. Machine Learning Tools: TensorFlow, PyTorch, scikit-learn. Big Data Technologies: Hadoop, Spark. Visualization Tools: Tableau, Power BI. Cloud Platforms: AWS, Azure, Google Cloud. Data Engineering: Talend, Databricks, Snowflake, Data Factory. Statistical Software: R, Python libraries. Version Control: Git. Preferred Qualifications: Master's or PhD in Data Science, Computer Science, Statistics, or a related field. Certifications in data science or machine learning. 7+ years of experience in a senior data science role with enterprise-scale impact. Experience managing AI/ML projects end-to-end. Solid communication skills for technical and non-technical audiences. Demonstrated problem-solving and analytical thinking. Business acumen to align data science with strategic goals. Knowledge of data governance and quality standards. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Engineering Manager to lead a team building data pipelines, models, and analytics infrastructure. Ideal for experienced engineers who can manage both technical delivery and team growth. Key Responsibilities: Lead development of ETL/ELT pipelines and data platforms Manage data engineers and collaborate with analytics/data science teams Architect systems for data ingestion, quality, and warehousing Define best practices for data architecture, testing, and monitoring Required Skills & Qualifications: Strong experience with big data tools (Spark, Kafka, Airflow) Proficiency in SQL, Python, and cloud data services (e.g., Redshift, BigQuery) Proven leadership and team management in data engineering contexts Bonus: Experience with real-time streaming and ML pipeline integration Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 1 week ago
10.0 - 15.0 years
12 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description: Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, PBs, and web scraping. Make data available to systematic and fundamental PMs and to enterprise functions: Ops, Risk, Trading, and Compliance. Develop internal data products and analytics. Responsibilities: Web scraping using scripts/APIs/tools. Help build and maintain a greenfield data platform running on Snowflake and AWS. Understand the existing pipelines and enhance them for new requirements. Onboard new data providers. Data migration projects. Skills, must have: 10+ years of experience as a Data Engineer; SQL; Python; Linux; containerization (Docker, Kubernetes); good communication skills; AWS; strong on the DevOps side (Kubernetes, Docker, Jenkins); ready to work in the EU time zone. Nice to have: market data projects / capital markets experience; Snowflake is a big plus; Airflow. Other Languages: English: B2 Upper Intermediate. Location: Pune, Bangalore, Hyderabad, Chennai, Noida
Posted 1 week ago
7.0 - 12.0 years
14 - 20 Lacs
Chennai
Work from Office
Senior Data Engineer: As a Senior Data Engineer, you will play a crucial role in designing, developing, and maintaining the data architecture and infrastructure necessary for efficient and scalable data processing. Leveraging your expertise in data technologies and software engineering, you will collaborate with cross-functional teams to ensure the reliability, performance, and accessibility of our data systems. Your responsibilities will encompass everything from data modeling and ETL processes to optimizing data pipelines and implementing best practices for data governance and security. Key Responsibilities: Data Architecture Design: Design, develop, and maintain scalable data architectures that support the storage, processing, and analysis of large volumes of data. Data Pipeline Development: Develop robust and efficient ETL processes and data pipelines to ingest, transform, and load data from various sources into the data warehouse or other storage systems. Data Modeling: Design and implement data models that facilitate efficient querying and analysis, ensuring data integrity, accuracy, and consistency. Performance Optimization: Optimize database queries and data processing workflows to improve performance, scalability, and reliability. Monitoring and Maintenance: Monitor data pipelines and systems for performance issues, troubleshoot problems as they arise, and ensure the reliability and availability of data systems. Data Governance and Security: Knowledge of data governance policies and security measures to ensure compliance with regulatory requirements and protect sensitive data is a plus. Tool Evaluation and Implementation: Evaluate new technologies and tools to enhance the data infrastructure and recommend appropriate solutions based on business needs and technical requirements. Documentation: Document data architecture, processes, and workflows to facilitate knowledge sharing and ensure the sustainability of data solutions. Mentorship and Leadership: Provide guidance and mentorship to junior team members, sharing best practices and fostering a culture of continuous learning and improvement. Required Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven total experience of 8-10 years, with at least 7 years in a similar role as a data engineer focused on designing and implementing scalable data solutions. Proficiency in programming languages such as Python, Scala, or T-SQL, and experience with data processing frameworks like Apache Spark. Strong SQL skills and experience with relational databases (MSSQL, MySQL, etc.). Experience with the AWS cloud platform and proficiency in cloud-based data services; AWS Redshift and Databricks are a must. Expertise in ETL tools like SSIS. Working knowledge of cloud ELT services like AWS Glue or Azure Data Factory is a plus. Working knowledge of non-relational/NoSQL databases is a plus. Knowledge of data warehousing concepts, dimensional modeling, and data warehouse design patterns. Familiarity with DevOps practices and tools for continuous integration and deployment (CI/CD). Excellent problem-solving skills and the ability to analyze complex data systems and workflows. Strong communication skills and the ability to collaborate effectively with cross-functional teams. Experience with big data technologies such as Hadoop, Apache Kafka, or Apache HBase is a plus. Can start ASAP or within 30 days.
Education / Certifications: Bachelor's/College Degree in Engineering (Computer/Telecommunication), Computer Science/Information Technology, or equivalent. Work Location / Work Schedule / Travel: Chennai TaskUs Office Day Shift Schedule Onsite Setup How We Partner To Protect You: TaskUs will neither solicit money from you during your application process nor require any form of payment in order to proceed with your application. Kindly ensure that you are always in communication with only authorized recruiters of TaskUs. DEI: In TaskUs we believe that innovation and higher performance are brought by people from all walks of life. We welcome applicants of different backgrounds, demographics, and circumstances. Inclusive and equitable practices are our responsibility as a business. TaskUs is committed to providing equal access to opportunities. If you need reasonable accommodations in any part of the hiring process, please let us know. We invite you to explore all TaskUs career opportunities and apply through the provided URL https://www.taskus.com/careers/ .
Posted 1 week ago
6.0 - 9.0 years
10 - 24 Lacs
Gurugram
Work from Office
Responsibilities: * Design, develop & maintain data pipelines using Snowflake, AWS/GCP * Collaborate with cross-functional teams on ETL processes & data modeling
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Noida
Work from Office
Role & responsibilities • Bachelor's degree in Computer Science, Engineering, or a related field. • Proven experience as a Data Engineer, with a minimum of 2 years of hands-on experience working with Azure Data Factory. • Strong proficiency in SQL and experience with relational databases (e.g., SQL Server, Azure SQL Database). • Solid understanding of data modeling concepts and ETL principles. • Experience with cloud-based data technologies, specifically Microsoft Azure (Azure Data Lake Storage, Azure SQL Data Warehouse, etc.). • Familiarity with data orchestration and workflow scheduling tools (e.g., Azure Data Factory). • Knowledge of programming languages such as Python is a plus. • Excellent problem-solving skills and attention to detail. • Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment. • Azure certifications (e.g., Azure Data Engineer, Azure Developer) are desirable but not required.
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders. Job Description - Grade Specific: Has more than a year of relevant work experience. Solid understanding of programming concepts, software design, and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid-term horizon. Works co-operatively with others to achieve team goals, has a direct and positive impact on project performance, and makes decisions based on their understanding of the situation, not just the rules. Skills (competencies): Verbal Communication
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Gurugram
Work from Office
Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders. Job Description - Grade Specific: Has more than a year of relevant work experience. Solid understanding of programming concepts, software design, and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid-term horizon. Works co-operatively with others to achieve team goals, has a direct and positive impact on project performance, and makes decisions based on their understanding of the situation, not just the rules.
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently. Job Description - Grade Specific: An expert on the principles and practices associated with data platform engineering, particularly within cloud environments, who demonstrates proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass: Team Leadership and Management: Supervising a team of platform engineers, with a focus on team dynamics and the efficient delivery of cloud platform solutions. Technical Guidance and Decision-Making: Providing technical leadership and making pivotal decisions concerning platform architecture, tools, and processes; balancing hands-on involvement with strategic oversight. Mentorship and Skill Development: Guiding team members through mentorship, enhancing their technical proficiencies, and nurturing a culture of continual learning and innovation in platform engineering practices. In-Depth Technical Proficiency: Possessing a comprehensive understanding of platform engineering principles and practices, and demonstrating expertise in crucial technical areas such as cloud services, automation, and system architecture. Community Contribution: Making significant contributions to the development of the platform engineering community, staying informed about emerging trends, and applying this knowledge to drive enhancements in capability.
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently. Job Description - Grade Specific: A strong grasp of the principles and practices associated with data platform engineering, particularly within cloud environments, with demonstrated proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass: Community Engagement: Actively participating in the professional data platform engineering community, sharing insights, and staying up-to-date with the latest trends and best practices. Project Contributions: Making substantial contributions to client delivery, particularly in the design, construction, and maintenance of cloud-based data platforms and infrastructure. Technical Expertise: Demonstrating a sound understanding of data platform engineering principles and knowledge in areas such as cloud data storage solutions (e.g., AWS S3, Azure Data Lake), data processing frameworks (e.g., Apache Spark), and data orchestration tools. Independent Work and Initiative: Taking ownership of independent tasks, displaying initiative and problem-solving skills when confronted with intricate data platform engineering challenges. Emerging Leadership: Commencing leadership roles, which may encompass mentoring junior engineers, leading smaller project teams, or taking the lead on specific aspects of data platform projects.
Posted 1 week ago
2.0 - 5.0 years
3 - 7 Lacs
Ahmedabad
Work from Office
Roles and Responsibilities: Develop expertise in the different upstream data stores and systems. Design, develop, and maintain data integration pipelines for growing data sets and product offerings. Build testing and QA plans for data pipelines. Skilled in ETL data engineering. Build data validation testing frameworks to ensure high data quality and integrity. Write and maintain documentation on data pipelines and schemas. Extensive experience with data integration tools to analyse root causes and provide fixes for production and development issues. Good understanding of Hive, Spark, and Hadoop architecture and optimization.
Posted 1 week ago
5.0 - 10.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Job Description: We are seeking a talented and experienced Data Scientist to join our dynamic team. The ideal candidate will have a strong background in data analysis, machine learning, statistical modeling, and artificial intelligence. Experience with Natural Language Processing (NLP) is desirable. Experience delivering products that incorporate AI/ML and familiarity with cloud services such as AWS are highly desirable. Key Responsibilities: Clean, prepare, and explore data to find trends and patterns. Build, validate, and implement AI/ML models. Extensively document all aspects of the work, including data analysis, model development, and results. Collaborate with other team members to incorporate AI/ML models into software applications. Stay updated with the latest advancements in the AI/ML domain and incorporate them into day-to-day work. Required Skills/Qualifications: 3-5 years of experience in AI/ML-related work. Extensive experience in Python. Familiarity with statistical models such as linear/logistic regression, Bayesian models, classification/clustering models, and time series analysis. Experience with deep learning models such as CNNs, RNNs, LSTMs, and Transformers. Experience with machine learning frameworks such as TensorFlow, PyTorch, Scikit-learn, and Keras. Experience with GenAI, LLMs, and RAG architecture would be a plus. Familiarity with cloud services such as AWS and Azure. Familiarity with version control systems (e.g., Git), JIRA, and Confluence. Familiarity with MLOps concepts and AI/ML pipeline tooling such as Kedro. Knowledge of CI/CD pipelines and DevOps practices. Experience delivering customer-facing AI solutions delivered as SaaS would be a plus. Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Strong problem-solving skills and attention to detail. Excellent verbal and written communication and teamwork skills. Benefits: Competitive salary and benefits package. Opportunity to work on cutting-edge technologies and innovative projects. Collaborative and inclusive work environment. Professional development and growth opportunities.
Posted 1 week ago
5.0 - 8.0 years
15 - 30 Lacs
Chennai, Bengaluru
Hybrid
Job Description: We are seeking an experienced Data Engineer to maintain and optimize our on-premises data warehouse environment. This role involves day-to-day support of production systems, ownership of ETL pipelines, and delivery of automated operational reports. The ideal candidate will have deep expertise in traditional data warehousing methodologies and SQL, and be comfortable working in client-facing environments. Job Responsibilities: Data Warehousing & ETL: Proven experience with on-premises data warehousing solutions (e.g., MSSQL Server, Oracle). Hands-on expertise in ETL tools such as SSIS, Informatica, or similar. SQL & Database Management: Strong T-SQL skills for query development and optimization. Experience with creating and maintaining stored procedures, triggers, and complex queries. Understanding of database concepts like indexing, partitioning, and locking. Production Environment Experience: Comfortable working in a high-pressure environment with strict SLA requirements. Proven track record in handling production issues, performing root-cause analysis, and implementing solutions. Reporting & BI Tools: Knowledge of operational reporting workflows; ability to manage automated reporting pipelines. Familiarity with reporting tools (e.g., SSRS, Crystal Reports, Power BI) for troubleshooting and support, though the primary focus is not dashboard creation. Problem-Solving & Troubleshooting: Ability to quickly diagnose data issues in complex data flows. Strong analytical skills for performance tuning and error resolution. Communication & Collaboration: Excellent verbal and written communication skills for coordinating with business stakeholders. Ability to work cross-functionally with multiple teams, including analysts, developers, and IT operations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
Posted 1 week ago
3.0 - 8.0 years
2 - 7 Lacs
Noida, New Delhi, Greater Noida
Work from Office
About Info Edge: Info Edge's mission is to create world-class platforms that transform lives by continuously innovating. Our products and services are built keeping our customers in mind. We always delight our customers by delivering superior value through enhanced offerings on the internet and other platforms. Through our continuous investment across various businesses, especially in cutting-edge technology, machine learning, and artificial intelligence (AI), we have built a robust system that constantly increases our predictive powers on customer behaviour and optimizes and improves our systems. Our various teams tirelessly work together to solve problems, innovate, and create something to empower our customers. At Info Edge, people are our core competitive advantage, and we will continue doing all that is needed to attract and retain the best available talent. About the BU: Naukri.com: Naukri is India's market leader in the recruitment business. It provides all job seekers with advisory services, caters to their different needs, and offers value-added features such as resume writing, highlighting, and many more. With over 67 million resume searches daily, Naukri.com has 5 million job listings, 59 thousand+ unique clients, and 4.9 million recruiters who connect with job seekers via email. Naukri eHire acts as an extension of our recruitment team: whenever there is a spurt in our requirements, instead of giving it to consultants at a high cost, we use eHire services to get shortlisted and validated CVs. Job Description: Managing the entire talent acquisition cycle, right from understanding the manpower requirement, sourcing candidates, interviewing candidates, negotiating offers, and closing the position. Expertise in recruiting entry-level, middle-level, and senior-level positions for tech and non-tech requirements. Working closely with business managers to ensure an in-depth understanding of the hiring mandate and create impactful job descriptions. Depending on the desired candidate profile, effectively source candidates from varied sources such as job portals, campus hiring, walk-ins, head hunting, internal referrals, etc. Achieving the monthly, quarterly, and annual hiring targets to meet the manpower projections. Adhering to internal HR processes, such as ensuring proper documentation, monthly hiring MIS generation, and preparing offer letters, within the specified TAT. Maintaining, documenting, and presenting progress reports to leaders. Desired Candidate Profile: Excellent communication skills; innovative, goal-driven, aggressive. Should have client management experience. Fast learner capable of handling pressure. Good recruitment skills. Graduates and postgraduates are both eligible. Required Skills: Excellent verbal and written communication. Well versed with the Naukri portal for recruitment activities. In-depth knowledge of HR principles, functions, and practices. Experience of hiring for middle-level to senior-level positions. ONLY CONSULTING EXPERIENCE REQUIRED. WhatsApp for more info: 9313787329
Posted 1 week ago
2.0 - 6.0 years
0 - 1 Lacs
Pune
Work from Office
As Lead ML Engineer, you'll lead the development of predictive models for demand forecasting, customer segmentation, and retail optimization, from feature engineering through deployment. Responsibilities: Build and deploy models for forecasting and optimization. Perform time-series analysis, classification, and regression. Monitor model performance and integrate feedback loops. Use AWS SageMaker, MLflow, and explainability tools (e.g., SHAP or LIME).
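Purely as an illustration of the explainability step mentioned above, the sketch below trains a toy regressor on synthetic retail-style features and computes per-prediction SHAP contributions; the features, data, and model choice are assumptions for the example, not details from this posting.

```python
# Minimal sketch, assuming a tree-based regressor and synthetic data:
# compute per-feature SHAP contributions for a demand-style prediction.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Toy features: price, promotion flag, day of week (all invented)
X = np.column_stack([
    rng.uniform(50, 150, 500),   # price
    rng.integers(0, 2, 500),     # promotion flag
    rng.integers(0, 7, 500),     # day of week
])
y = 200 - 0.8 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 5, 500)  # "demand"

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer attributes each prediction to the individual features.
explainer = shap.TreeExplainer(model)
print(explainer.shap_values(X[:5]))  # one row of contributions per prediction
```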
Posted 1 week ago