10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Manager, Software Engineer

Overview
We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard – Data & Services
The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Targeting Analytics Program
Within the D&S Technology Team, the Targeting Analytics program is a relatively new program comprising a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights.
Currently, we are enhancing our customer experience with new user interfaces, moving to API-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes.

We are seeking an innovative Lead Software Engineer to lead our team in designing and building a full-stack web application and data pipelines. The goal is to deliver custom analytics efficiently, leveraging machine learning and AI solutions. This individual will thrive in a fast-paced, agile environment and partner closely with other areas of the business to build and enhance solutions that drive value for our customers.

Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects.

Here are a few examples of products in our space:
• Portfolio Optimizer (PO) is a solution that leverages Mastercard's data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
• Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have a high likelihood of making purchases within a category, allowing for more effective campaign planning and activation.
• Credit Risk products are a new suite of APIs and tooling that provide lenders real-time access to KPIs and insights, serving thousands of clients making smarter risk decisions using Mastercard data.

Help found a new, fast-growing engineering team!
Position Responsibilities
As a Lead Software Engineer, you will:
• Lead the scoping, design, and implementation of complex features
• Push the boundaries of analytics and powerful, scalable applications
• Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
• Build and maintain analytics and data models to enable performant and scalable products
• Ensure a high-quality code base by writing and reviewing performant, well-tested code
• Mentor junior software engineers and teammates
• Drive innovative improvements to team development processes
• Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features
• Collaborate across teams with exceptional peers who are passionate about what they do

Ideal Candidate Qualifications
• 10+ years of engineering experience in an agile production environment
• Experience leading the design and implementation of complex features in full-stack applications
• Proficiency with object-oriented languages, preferably Java/Spring
• Proficiency with modern front-end frameworks, preferably React with Redux and TypeScript
• High proficiency with Python or Scala, Spark, and Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop)
• Fluency with Git and Jenkins
• Solid experience with RESTful APIs and JSON/SOAP-based APIs
• Solid experience with SQL, multi-threading, and message queuing
• Experience building and deploying production-level data-driven applications, data processing workflows/pipelines, and/or machine learning systems at scale in Java, Scala, or Python, delivering analytics across all phases

Desirable Capabilities
• Hands-on experience with cloud-native development using microservices
• Hands-on experience with Kafka and Zookeeper
• Knowledge of security concepts and protocols in enterprise applications
• Expertise with automated E2E and unit testing frameworks
• Knowledge of Splunk or other alerting and monitoring solutions

Core Competencies
• Strong technologist eager to learn new technologies and frameworks
• Experience coaching and mentoring junior teammates
• Customer-centric development approach
• Passion for analytical/quantitative problem solving
• Ability to identify and implement improvements to team development processes
• Strong collaboration skills with experience collaborating across many people, roles, and geographies
• Motivation, creativity, self-direction, and desire to thrive on small project teams
• Superior academic record with a degree in Computer Science or a related technical field
• Strong written and verbal English communication skills

#AI3

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
• Abide by Mastercard's security policies and practices;
• Ensure the confidentiality and integrity of the information being accessed;
• Report any suspected information security violation or breach; and
• Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 1 week ago
4.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
The Smart Cube, a WNS company, is a trusted partner for high performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster.

Job Description
Roles and Responsibilities
Assistant Managers are expected to understand client objectives and collaborate with the Project Lead to design appropriate analytical solutions. They should be able to translate business goals into structured deliverables with defined priorities and constraints. The role involves managing, organizing, and preparing data, conducting quality checks, and ensuring readiness for analysis.

They should be proficient in applying statistical and machine learning techniques such as regression (linear/non-linear), decision trees, segmentation, time series forecasting, and algorithms like Random Forest, SVM, and ANN. Sanity checks and rigorous self-QC of all outputs, including work from junior analysts, are essential to ensure accuracy.

Interpretation of results in the context of the client's industry is necessary to generate meaningful insights. Assistant Managers should be comfortable handling client calls independently and coordinating regularly with onsite leads when applicable. They should be able to discuss specific deliverables or queries over calls or video conferences.

They must manage projects from initiation through closure, ensuring timely and within-budget delivery. This includes collaborating with stakeholders to refine business needs and convert them into technical specifications, managing data teams, conducting performance evaluations, and ensuring high data quality. Effective communication between technical and business stakeholders is key to aligning expectations. Continuous improvement of analytics processes and methodologies is encouraged.
The role also involves leading cross-functional teams and overseeing project timelines and deliverables.

Client Management
Assistant Managers will act as the primary point of contact for clients, maintaining strong relationships and making key decisions independently. They will participate in discussions on deliverables and guide project teams on next steps and solution approaches.

Technical Requirements
Candidates must have hands-on experience connecting databases with Knime (e.g., Snowflake, SQL DB) and working with SQL concepts such as joins and unions. They should be able to read from and write to databases, utilize macros to automate tasks, and enable schedulers to run workflows. The ability to design and build ETL workflows and datasets in Knime for BI reporting tools is crucial. They must perform end-to-end data validation and maintain documentation supporting BI reports.

They should be experienced in developing interactive dashboards and reports using PowerBI and leading analytics projects using PowerBI, Python, and SQL. Presenting insights clearly through PowerPoint or BI dashboards (e.g., Tableau, Qlikview) is also expected.

Ideal Candidate
The ideal candidate will have 4 to 7 years of relevant experience in advanced analytics for Marketing, CRM, or Pricing within Retail or CPG; other B2C sectors may also be considered. Experience in managing and analyzing large datasets using Python, R, or SAS is required, along with the use of multiple analytics and machine learning techniques.

They should be able to manage client communications independently and understand consumer-facing industries such as Retail, CPG, or Telecom. Familiarity with handling various data formats (flat files, RDBMS) and platforms (Knime, SQL Server, Teradata, Hadoop, Spark) in both on-premise and cloud environments is expected.
A solid foundation in advanced statistical techniques such as regressions, decision trees, clustering, forecasting (ARIMA/X), and machine learning is essential.

Other Skills
Strong verbal and written communication is a must. The candidate should be able to deliver client-ready outputs using Excel and PowerPoint. Knowledge of optimization techniques (linear/non-linear), supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview is a plus.

Qualifications
Engineers from top tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top tier colleges/universities; MBA from top tier B-schools.
Posted 1 week ago
7.0 - 12.0 years
13 - 17 Lacs
Noida
Work from Office
Primary Responsibilities
• Application Development: Building cloud-based applications, especially on Azure, ensuring seamless integration and connectivity of resources
• Technical Solutions: Providing different solutions and connecting various resources without issues
• Working with stakeholders to understand the modifications they wish to apply to their current systems
• Analyzing current systems to find flaws that can jeopardize cloud security
• Uploading business information to a cloud computing platform and setting up simple retrieval mechanisms for data
• Staying updated on advancements in the field of cloud computing to advise businesses and clients on industry best practices
• Increasing cloud storage capacity to store more files and crucial corporate data
• Ensuring the protection of data in computer systems by working with cybersecurity and IT staff
• Troubleshooting issues pertaining to cloud application failure or security flaws
• Creating and installing cloud computing solutions in accordance with client or employer requirements
• Automating specific system operations to enhance efficiency and speed
• Testing designs to find and fix mistakes and make system improvements
• Evaluating and identifying the best cloud solutions in collaboration with engineering and development teams
• Developing, establishing, and implementing modular cloud-based applications
• Locating, evaluating, and fixing infrastructure risks and deployment problems
• Periodically evaluating computer systems and offering suggestions for performance enhancements
• Offering support and guidance to meet customer requirements
• Managing ServiceNow tickets assigned to the team
• Maintaining and managing Linux servers, including OS and application patching and upgrades
• Writing automation scripts using Bash and Python
• Analyzing and investigating issues, providing explanations and interpretations within area of expertise
• Complying with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
• Graduate degree or equivalent experience
• 7+ years of cloud development and enterprise-level troubleshooting
• 7+ years of demonstrated IT experience supporting and troubleshooting enterprise-level, mission-critical applications, resolving complex issues/situations and driving technical resolution across cross-functional organizations
• Experience with QA or testing concepts and processes
• Hands-on experience in setting up and managing production environments
• Familiarity with CI/CD pipeline tools (Jenkins, Bamboo, Maven/Gradle, SonarQube, Git) for the deployment process
• Proven solid troubleshooting and debugging skills (security, monitoring, server load, networking)
• Proven analytical skills
• Proven ability to build and run SQL queries to extract data and compare data across multiple reports for consistency

Preferred Qualifications
• Experience with data platforms and big data: SQL IaaS, Azure SQL DB, Cosmos DB, PostgreSQL, HDInsight/Hadoop, Compute, Storage, Networking, High Availability
• Healthcare domain knowledge

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.

#Nic
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
We are looking for an enthusiastic Data Scientist to join our team based in Bangalore. You will be pivotal in developing, deploying, and optimizing recommendation models that significantly enhance user experience and engagement. Your work will directly influence how customers interact with products, driving personalization and conversion.

Responsibilities
• Model Development: Design, build, and fine-tune machine learning models focused on personalized recommendations to boost user engagement and retention.
• Data Analysis: Perform a comprehensive analysis of user behavior, interactions, and purchasing patterns to generate actionable insights.
• Algorithm Optimization: Continuously improve recommendation algorithms by experimenting with new techniques and leveraging state-of-the-art methodologies.
• Deployment and Monitoring: Deploy machine learning models into production environments, and develop tools for continuous performance monitoring and optimization.

Requirements
Education level: Bachelor's degree (B.E./B.Tech) in Computer Science or equivalent from a reputed institute.

Technical Expertise
• Strong foundation in Statistics, Probability, and core Machine Learning concepts.
• Hands-on experience developing recommendation algorithms, including collaborative filtering, content-based filtering, matrix factorization, or deep learning approaches.
• Proficiency in Python and associated libraries (NumPy, Pandas, Scikit-Learn, PySpark).
• Experience with TensorFlow or PyTorch frameworks and familiarity with recommendation system libraries (e.g., torch-rec).
• Solid understanding of Big Data technologies and tools (Hadoop, Spark, SQL).
• Familiarity with the full Data Science lifecycle, from data collection and preprocessing to model deployment.

This job was posted by Rituza Rani from Oneture Technologies.
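For context on the collaborative filtering mentioned in the requirements, here is a minimal, dependency-free sketch of the item-based variant: unseen items are scored by their cosine similarity to items the user has already interacted with. The data and function names are illustrative only, not the team's actual implementation.

```python
from math import sqrt

# Toy user-item interaction matrix (rows: users, columns: items).
# 1.0 = interacted/purchased, 0.0 = no interaction. Invented data.
ratings = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 1.0, 0.0],
    [0.0, 0.0, 1.0, 1.0],
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def item_similarity(matrix):
    """Pairwise cosine similarity between item columns."""
    cols = list(zip(*matrix))  # transpose: one vector per item
    n = len(cols)
    return [[cosine(cols[i], cols[j]) for j in range(n)] for i in range(n)]

def recommend(matrix, sim, user, k=2):
    """Rank unseen items by total similarity to the user's seen items."""
    seen = [i for i, v in enumerate(matrix[user]) if v > 0]
    scores = {
        i: sum(sim[i][j] for j in seen)
        for i in range(len(sim)) if i not in seen
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

sim = item_similarity(ratings)
print(recommend(ratings, sim, user=0))  # [2, 3]: item 2 shares a co-purchaser with user 0's items
```

Production systems replace this brute-force similarity pass with matrix factorization or learned embeddings, but the scoring idea is the same.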
Posted 1 week ago
4.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Seeking a highly skilled and analytical Data Scientist / Machine Learning Engineer to join our team. The ideal candidate will have a strong foundation in data science, machine learning, and statistical analysis, with hands-on experience in building scalable models and working with large datasets in cloud-based environments.

Key Responsibilities:
• Design, develop, and deploy machine learning models for real-world applications using Python, scikit-learn, TensorFlow, or PyTorch.
• Perform data cleaning, wrangling, transformation, and exploratory data analysis (EDA) using pandas and other Python libraries.
• Apply statistical techniques such as hypothesis testing, regression analysis, ANOVA, and probability theory to derive insights.
• Implement supervised and unsupervised learning algorithms; conduct feature engineering, model evaluation, and hyperparameter tuning.
• Develop data visualizations using Matplotlib, Seaborn, or ggplot to communicate findings effectively.
• Work with big data technologies such as Apache Hadoop, Apache Spark, or distributed databases for large-scale data processing.
• Write optimized SQL queries and manage relational databases for data extraction and transformation.
• Contribute to data engineering pipelines, including data integration, warehousing, and architecture design.
• Utilize cloud platforms like Microsoft Azure, including Azure Machine Learning, for scalable model deployment.
• Collaborate using version control systems like Git for code management and team coordination.
• (Optional/Desirable) Experience with YOLO or other deep learning models for object detection.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Engineering, or a related field.
• 4-8 years of experience in data science, machine learning, or AI development.
• Strong programming skills in Python and experience with data science libraries.
• Solid understanding of statistical methods, machine learning algorithms, and data modeling.
• Experience with cloud computing, big data tools, and database management.

Preferred Skills:
• Experience with YOLO, OpenCV, or computer vision frameworks.
• Familiarity with CI/CD pipelines for ML model deployment.
• Knowledge of MLOps practices and tools.

Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
The Smart Cube, a WNS company, is a trusted partner for high performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster.

Job Description
Roles and Responsibilities
Assistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN.

They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences.

They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination.

Client Management
They serve as client leads, maintaining strong relationships and making key decisions.
They participate in deliverable discussions and guide project teams on next steps and execution strategy.

Technical Requirements
Assistant Managers must know how to connect databases with Knime (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read/write data to and from databases and use macros and schedulers to automate workflows. They must design and manage Knime ETL workflows to support BI tools and ensure end-to-end data validation and documentation.

Proficiency in PowerBI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using PowerBI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or Qlikview is essential.

Ideal Candidate
Candidates should have 4–7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG. Experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom.

They should have experience with various data formats and platforms including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark—on-prem or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required.

Other Skills
Strong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview will be an added advantage.
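The SQL joins and unions named in the technical requirements can be demonstrated in a few lines using Python's built-in sqlite3 module (an in-memory stand-in for the Snowflake or SQL Server databases a Knime workflow would actually query; table names and data are invented).

```python
import sqlite3

# In-memory database standing in for the warehouse a Knime workflow would connect to.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# JOIN: combine rows from two tables on a key, here aggregating spend per customer.
joined = con.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(joined)  # [('Asha', 350.0), ('Ravi', 75.0)]

# UNION: stack compatible result sets vertically, deduplicating rows.
names = con.execute("""
    SELECT name FROM customers
    UNION
    SELECT 'Asha'
""").fetchall()
print(names)  # 'Asha' appears once despite being selected twice
```

A join widens rows by matching keys; a union lengthens a result set by appending compatible rows, which is the distinction the requirement is probing for.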
Qualifications
Engineers from top tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top Tier Colleges/Universities; MBA from top tier B-schools.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary: We are looking for a skilled and experienced Data Engineer with over 5 years of experience in data engineering and data migration projects. The ideal candidate should possess strong expertise in SQL, Python, data modeling, data warehousing, and ETL pipeline development. Experience with big data tools like Hadoop and Spark, along with AWS services such as Redshift, S3, Glue, EMR, and Lambda, is essential. This role provides an excellent opportunity to work on large-scale data solutions, enabling data-driven decision-making and operational excellence. Key Responsibilities: • Design, build, and maintain scalable data pipelines and ETL processes. • Develop and optimize data models and data warehouse architectures. • Implement and manage big data technologies and cloud-based data solutions. • Perform data migration, data transformation, and integration from multiple sources. • Collaborate with data scientists, analysts, and business teams to understand data needs and deliver solutions. • Ensure data quality, consistency, and security across all data pipelines and storage systems. • Optimize performance and manage cost-efficient AWS cloud resources. Basic Qualifications: • Master's degree in Computer Science, Engineering, Analytics, Mathematics, Statistics, IT, or equivalent. • 5+ years of experience in Data Engineering and data migration projects. • Proficient in SQL and Python for data processing and analysis. • Strong experience in data modeling, data warehousing, and building data pipelines. • Hands-on experience with big data technologies like Hadoop, Spark, etc. • Expertise in AWS services including Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM. • Understanding of ETL development best practices and principles. Preferred Qualifications: • Knowledge of data security and data privacy best practices. • Experience with DevOps and CI/CD practices related to data workflows. 
• Familiarity with data lake architectures and real-time data streaming. • Strong problem-solving abilities and attention to detail. • Excellent verbal and written communication skills. • Ability to work independently and in a team-oriented environment. Good to Have: • Experience with orchestration tools like Airflow or Step Functions. • Exposure to BI/Visualization tools like QuickSight, Tableau, or Power BI. • Understanding of data governance and compliance standards.
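The extract-transform-load pattern at the heart of the pipeline responsibilities above can be sketched with the standard library alone (this stands in for the kind of stage an AWS Glue job or Airflow task would run; the CSV data and data-quality rule are invented for illustration).

```python
import csv
import io
import json

# Raw input, as it might arrive from an upstream source system. Invented data.
RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,,USD
1003,75.50,EUR
"""

def extract(text):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: enforce a data-quality rule and cast types."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # drop rows missing a required amount
        out.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"],
        })
    return out

def load(rows):
    """Load: serialize to newline-delimited JSON, a common warehouse staging format."""
    return "\n".join(json.dumps(r) for r in rows)

clean = transform(extract(RAW_CSV))
print(load(clean))  # two NDJSON lines; the row with a missing amount is dropped
```

Real pipelines swap these functions for Spark jobs, Glue transforms, or Redshift COPY stages, but the extract/transform/load separation and the in-transform quality gate are the same idea.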
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from LTIMindtree!!

About the job
Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints so their end customers can make an informed purchase decision, will surely be a fulfilling experience.

Location: Pan India.
Key Skill: Hadoop-Spark SparkSQL – Java

Interested candidates kindly apply via the link below and share an updated CV to Hemalatha1@ltimindtree.com
https://forms.office.com/r/zQucNTxa2U

Skills needed:
1. Hands-on experience with Java and Big Data technologies, including Spark, Hive, Impala
2. Experience with a streaming framework such as Kafka
3. Hands-on experience with object storage; should be able to develop data archival and retrieval patterns
4. Good to have experience with any public cloud platform like AWS, Azure, GCP, etc.
5. Ready to upskill as and when needed on project technologies, viz. Ab Initio

Why join us?
• Work in industry-leading implementations for Tier-1 clients
• Accelerated career growth and global exposure
• Collaborative, inclusive work environment rooted in innovation
• Exposure to a best-in-class automation framework
• Innovation-first culture: we embrace automation, AI insights and clean data

Know someone who fits this perfectly? Tag them – let's connect the right talent with the right opportunity. DM or email to know more. Let's build something great together!
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from LTIMindtree!!

About the job
Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints so their end customers can make an informed purchase decision, will surely be a fulfilling experience.

Location: Pan India.
Key Skill: Spark + Python

Interested candidates kindly apply via the link below and share an updated CV to Hemalatha1@ltimindtree.com
https://forms.office.com/r/zQucNTxa2U

Job Description
Key Skill: Hadoop-Spark SparkSQL – Python

Mandatory Skills:
• Relevant experience in ETL and Data Engineering
• Strong knowledge of Spark and Python
• Strong experience in Hive/SQL and PL/SQL
• Good understanding of ETL and DW concepts, Unix scripting
• Design, implement, and maintain data pipelines to meet business requirements
• Convert business needs into complex technical PySpark code
• Ability to write complex SQL queries for reporting purposes
• Monitor PySpark code performance and troubleshoot issues

Why join us?
• Work in industry-leading implementations for Tier-1 clients
• Accelerated career growth and global exposure
• Collaborative, inclusive work environment rooted in innovation
• Exposure to a best-in-class automation framework
• Innovation-first culture: we embrace automation, AI insights and clean data

Know someone who fits this perfectly? Tag them – let's connect the right talent with the right opportunity. DM or email to know more. Let's build something great together!
Posted 1 week ago
6.0 - 11.0 years
6 - 10 Lacs
Hyderabad
Work from Office
About the Role
In this opportunity, as Senior Data Engineer, you will:
• Develop and maintain data solutions using resources such as dbt, Alteryx, and Python.
• Design and optimize data pipelines, ensuring efficient data flow and processing.
• Work extensively with databases, SQL, and various data formats including JSON, XML, and CSV.
• Tune and optimize queries to enhance performance and reliability.
• Develop high-quality code in SQL, dbt, and Python, adhering to best practices.
• Understand and implement data automation and API integrations.
• Leverage AI capabilities to enhance data engineering practices.
• Understand integration points related to upstream and downstream requirements.
• Proactively manage tasks and work towards completion against tight deadlines.
• Analyze existing processes and offer suggestions for improvement.

About You
You're a fit for the role of Senior Data Engineer if your background includes:
• Strong interest and knowledge in data engineering principles and methods.
• 6+ years of experience developing data solutions or pipelines.
• 6+ years of hands-on experience with databases and SQL.
• 2+ years of experience programming in an additional language.
• 2+ years of experience in query tuning and optimization.
• Experience working with SQL, JSON, XML, and CSV content.
• Understanding of data automation and API integration.
• Familiarity with AI capabilities and their application in data engineering.
• Ability to adhere to best practices for developing programmatic solutions.
• Strong problem-solving skills and ability to work independently.

#LI-SS6

What's in it For You?
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 1 week ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
We are looking for a self-motivated individual with an appetite to learn new skills, ready to be part of a fast-paced team delivering cutting-edge solutions that drive new products and features critical for our customers. Our senior software engineers are responsible for designing, developing, and ensuring the quality, reliability, and availability of key systems that provide critical data and algorithms. Responsibilities include developing new applications and enhancing existing ones; you will work collaboratively with technical leads and architects to design, develop, and test these critical applications.
About the role
Actively participate in the full life cycle of software delivery, including analysis, design, implementation, and testing of new projects and features using Hadoop, Spark/PySpark, Scala or Java, Hive, SQL, and other open-source tools and design patterns. Python knowledge is a bonus for this role.
Working experience with HUDI, Snowflake, or similar.
Must-have technologies: Big Data and AWS services such as EMR, S3, Lambda, Elastic, and Step Functions.
Actively participate in the development and testing of features for assigned projects with little to no guidance.
The position offers opportunities to work under technical experts and to provide guidance and assistance to less experienced team members or new joiners on the project.
An appetite for learning will be a key attribute for doing well in this role, as the organization is very dynamic and offers tremendous scope across various technical landscapes.
We consider AI inclusion key to excelling in this role; we want dynamic candidates who use AI tools as build partners and share experiences to ignite the organization.
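The Spark processing style named above (filter, map to key/value pairs, reduce by key) can be sketched with plain Python primitives and no cluster; the event records and field names here are hypothetical:

```python
from itertools import groupby

# Hypothetical event records, standing in for rows read from Hive/S3.
events = [
    {"user": "u1", "bytes": 120},
    {"user": "u2", "bytes": 300},
    {"user": "u1", "bytes": 80},
]

# Spark-style pipeline: filter -> map to (key, value) -> reduceByKey.
pairs = [(e["user"], e["bytes"]) for e in events if e["bytes"] > 50]
pairs.sort(key=lambda kv: kv[0])  # groupby needs sorted input
totals = {
    user: sum(v for _, v in grp)
    for user, grp in groupby(pairs, key=lambda kv: kv[0])
}
print(totals)  # {'u1': 200, 'u2': 300}
```

In PySpark the equivalent would be `rdd.filter(...).map(...).reduceByKey(add)`; the in-memory version above shows the same dataflow shape.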
Proactively share knowledge and best practices on using new and emerging technologies across all development and testing groups.
Create, review, and maintain technical documentation of software development and testing artifacts.
Work collaboratively with others in a team-based environment.
Identify and participate in the resolution of issues with the appropriate technical and business resources.
Generate innovative approaches and solutions to technology challenges.
Effectively balance and prioritize multiple projects concurrently.
About you
Bachelor's or Master's degree in computer science or a related field.
7+ years of experience in the IT industry; product and platform development preferred.
Strong programming skills in Java or Scala.
Must-have technologies include Big Data and AWS; exposure to services such as EMR, S3, Lambda, Elastic, and Step Functions.
Knowledge of Python preferred.
Experience with Agile methodology, continuous integration, and/or Test-Driven Development.
Self-motivated with a strong desire for continual learning.
Takes personal responsibility to impact results and deliver on commitments.
Effective verbal and written communication skills.
Ability to work independently or as part of an agile development team.
#LI-SP1
Posted 1 week ago
3.0 - 8.0 years
11 - 15 Lacs
Bengaluru
Work from Office
The Core AI, BI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.
At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.
About the Role
In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform and management teams.
Stay updated on emerging trends and technologies in cloud computing.
About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience implementing data lakes and data management technologies for large-scale organizations.
Experience building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
Proficient in the Python programming language.
Experience in AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
Good knowledge of consuming and building APIs.
Business Intelligence tools such as Power BI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD, release management, and an Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting, and collaborative skills.
#LI-VGA1
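The Lambda-plus-SQS pattern implied by the service list above can be sketched with the stdlib alone. The event shape mirrors SQS's Records/body layout, and the validation rule (an "id" field is required) is a hypothetical example, not a real platform contract:

```python
import json

def handler(event, context=None):
    """Minimal AWS-Lambda-style handler: parse SQS-like records,
    validate each message body, and report successes/failures."""
    ok, failed = 0, []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            if "id" not in body:
                raise KeyError("id")
            ok += 1
        except (json.JSONDecodeError, KeyError):
            failed.append(record.get("messageId"))
    return {"processed": ok, "failed": failed}

# Hypothetical SQS-style test event.
event = {"Records": [
    {"messageId": "m1", "body": '{"id": 7}'},
    {"messageId": "m2", "body": "not json"},
]}
print(handler(event))  # {'processed': 1, 'failed': ['m2']}
```

Returning failed message IDs (rather than raising) is the shape AWS expects for partial-batch responses; the exact response keys vary by integration.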
Posted 1 week ago
8.0 - 13.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Responsibilities: * Collaborate with cross-functional teams on data initiatives. * Deliver corporate training. * Develop big data solutions using Hadoop, Python & SQL. * Flexible for offline and online sessions.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: Data Engineer
Job Summary
We are seeking an experienced Data Engineer with 5-8 years of professional experience to design, build, and optimize robust and scalable data pipelines for our SmartFM platform. The ideal candidate will be instrumental in ingesting, transforming, and managing vast amounts of operational data from various building devices, ensuring high data quality and availability for analytics and AI/ML applications. This role is critical in enabling our platform to generate actionable insights, alerts, and recommendations for optimizing facility operations.
Roles and Responsibilities
Design, develop, and maintain scalable and efficient data ingestion pipelines from diverse sources (e.g., IoT devices, sensors, existing systems) using technologies such as IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Kafka.
Implement robust data transformation and processing logic to clean, enrich, and structure raw data into formats suitable for analysis and machine learning models.
Manage and optimize data storage solutions, primarily within MongoDB, ensuring efficient schema design, data indexing, and query performance for large datasets.
Collaborate closely with Data Scientists to understand their data needs, provide high-quality, reliable datasets, and assist in deploying data-driven solutions.
Ensure data quality, consistency, and integrity across all data pipelines and storage systems, implementing monitoring and alerting mechanisms for data anomalies.
Work with cross-functional teams (Software Engineers, Data Scientists, Product Managers) to integrate data solutions with the React frontend and Node.js backend applications.
Contribute to the continuous improvement of data architecture, tooling, and best practices, advocating for scalable and maintainable data solutions.
Troubleshoot and resolve complex data-related issues, optimizing pipeline performance and ensuring data availability.
Stay updated with emerging data engineering technologies and trends, evaluating and recommending new tools and approaches to enhance our data capabilities.
Required Technical Skills and Experience
5-8 years of professional experience in Data Engineering or a related field.
Proven hands-on experience with data pipeline tools such as IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Apache Kafka.
Strong expertise in database management, particularly with MongoDB, including schema design, data ingestion pipelines, and data aggregation.
Proficiency in at least one programming language commonly used in data engineering, such as Python or Java/Scala.
Experience with big data technologies and distributed processing frameworks (e.g., Apache Spark, Hadoop) is highly desirable.
Familiarity with cloud platforms (Azure, AWS, or GCP) and their data services.
Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
Experience with DevOps practices for data pipelines (CI/CD, monitoring, logging).
Knowledge of Node.js and React environments to facilitate seamless integration with existing applications.
Additional Qualifications
Demonstrated expertise in written and verbal communication, adept at simplifying complex technical concepts for both technical and non-technical audiences.
Strong problem-solving and analytical skills with a meticulous approach to data quality.
Experienced in collaborating and communicating seamlessly with diverse technology roles, including development, support, and product management.
Highly motivated to acquire new skills, explore emerging technologies, and stay updated on the latest trends in data engineering and business needs.
Experience in the facility management domain or IoT data is a plus.
Education Requirements / Experience
Bachelor's (BE/BTech) / Master's degree (MS/MTech) in Computer Science, Information Systems, Mathematics, Statistics, or a related quantitative field.
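The "monitoring and alerting mechanisms for data anomalies" responsibility can be sketched as a per-record validation step. The sensor readings, field names, and plausible-range bounds below are hypothetical assumptions for illustration:

```python
# Hypothetical IoT sensor readings ingested by the pipeline.
readings = [
    {"device": "hvac-1", "temp_c": 21.5},
    {"device": "hvac-2", "temp_c": None},   # missing value
    {"device": "hvac-3", "temp_c": 480.0},  # out of plausible range
]

PLAUSIBLE = (-40.0, 85.0)  # assumed sensor operating range

def validate(record):
    """Return a list of data-quality issues for one record."""
    issues = []
    t = record.get("temp_c")
    if t is None:
        issues.append("missing temp_c")
    elif not (PLAUSIBLE[0] <= t <= PLAUSIBLE[1]):
        issues.append(f"temp_c {t} outside {PLAUSIBLE}")
    return issues

anomalies = {r["device"]: validate(r) for r in readings if validate(r)}
print(anomalies)
# {'hvac-2': ['missing temp_c'], 'hvac-3': ['temp_c 480.0 outside (-40.0, 85.0)']}
```

In production this check would run inside the stream processor (Flink/Spark) and publish the anomalies dict to an alerting topic rather than printing it.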
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer Location: Hyderabad Experience: 5+ Years Job Summary: We are looking for a skilled and experienced Data Engineer with over 5 years of experience in data engineering and data migration projects. The ideal candidate should possess strong expertise in SQL, Python, data modeling, data warehousing, and ETL pipeline development. Experience with big data tools like Hadoop and Spark, along with AWS services such as Redshift, S3, Glue, EMR, and Lambda, is essential. This role provides an excellent opportunity to work on large-scale data solutions, enabling data-driven decision-making and operational excellence. Key Responsibilities: • Design, build, and maintain scalable data pipelines and ETL processes. • Develop and optimize data models and data warehouse architectures. • Implement and manage big data technologies and cloud-based data solutions. • Perform data migration, data transformation, and integration from multiple sources. • Collaborate with data scientists, analysts, and business teams to understand data needs and deliver solutions. • Ensure data quality, consistency, and security across all data pipelines and storage systems. • Optimize performance and manage cost-efficient AWS cloud resources. Basic Qualifications: • Master's degree in Computer Science, Engineering, Analytics, Mathematics, Statistics, IT, or equivalent. • 5+ years of experience in Data Engineering and data migration projects. • Proficient in SQL and Python for data processing and analysis. • Strong experience in data modeling, data warehousing, and building data pipelines. • Hands-on experience with big data technologies like Hadoop, Spark, etc. • Expertise in AWS services including Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM. • Understanding of ETL development best practices and principles. Preferred Qualifications: • Knowledge of data security and data privacy best practices. • Experience with DevOps and CI/CD practices related to data workflows.
• Familiarity with data lake architectures and real-time data streaming. • Strong problem-solving abilities and attention to detail. • Excellent verbal and written communication skills. • Ability to work independently and in a team-oriented environment. Good to Have: • Experience with orchestration tools like Airflow or Step Functions. • Exposure to BI/Visualization tools like QuickSight, Tableau, or Power BI. • Understanding of data governance and compliance standards. Why Join Us? People Tech Group has significantly grown over the past two decades, focusing on enterprise applications and IT services. We are headquartered in Bellevue, Washington, with a presence across the USA, Canada, and India. We are also expanding to the EU, ME, and APAC regions. With a strong pipeline of projects and satisfied customers, People Tech has been recognized as a Gold Certified Partner for Microsoft and Oracle. Benefits: L1 Visa opportunities to the USA after 1 year of a proven track record. Competitive wages with private healthcare cover. Incentives for certifications and educational assistance for relevant courses. Support for family with maternity leave. Complimentary daily lunch and participation in employee resource groups. For more details, please visit People Tech Group.
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
This role involves the development and application of engineering practice and knowledge in the following technologies: standards and protocols; application software and embedded software for wireless and satellite networks, fixed networks, and enterprise networks; connected devices (IoT and device engineering); connected applications (5G/edge, B2X apps); and Telco Cloud, Automation, and Edge Compute platforms. This role also involves the integration of network systems and their operations related to the above technologies.
Grade Specific: Focus on Connectivity and Network Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle). Extract data from various sources including databases, flat files, APIs, and cloud platforms. Transform and cleanse data to meet business and technical requirements. Load data into data warehouses, data lakes, or other target systems. Monitor and optimize ETL performance and troubleshoot issues. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security throughout the ETL lifecycle. Document ETL processes, data flows, and technical specifications.
Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
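Pentaho's transform/cleanse stage is GUI-driven, but the logic it encodes is language-neutral. A minimal sketch of the same cleanse-and-deduplicate step in plain Python, with hypothetical input rows:

```python
# Raw extracted rows with the usual defects: stray whitespace,
# inconsistent casing, and duplicates (hypothetical sample).
raw = [
    {"email": "  A@Example.com ", "country": "in"},
    {"email": "a@example.com",    "country": "IN"},
    {"email": "b@example.com",    "country": "us"},
]

def cleanse(row):
    """Normalize one row before loading."""
    return {"email": row["email"].strip().lower(),
            "country": row["country"].strip().upper()}

# Transform + deduplicate on the cleansed key before the load stage.
seen, clean = set(), []
for row in map(cleanse, raw):
    if row["email"] not in seen:
        seen.add(row["email"])
        clean.append(row)

print(clean)
# [{'email': 'a@example.com', 'country': 'IN'}, {'email': 'b@example.com', 'country': 'US'}]
```

In Kettle the same step would be a String Operations transform followed by a Unique Rows step; the dedupe-after-cleanse ordering matters in both.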
Posted 1 week ago
10.0 - 15.0 years
14 - 18 Lacs
Gurugram
Work from Office
Your Role Drive the strategic vision for AI/ML initiatives and align them with the company's broader AI roadmap. Lead cross-functional teams including data scientists, ML engineers, and architects to deliver scalable AI solutions. Translate complex technical insights into actionable business strategies and communicate them effectively to stakeholders. Oversee the full lifecycle of AI/ML projects, leveraging LLMs and generative AI to build innovative, data-driven applications. Ensure high standards of data quality, model performance, and system reliability while staying current with emerging AI trends. Your Profile Over 10 years of experience in data science with a Master's/PhD from a Tier-1 institute, specializing in AI/ML model development and deployment. Expertise in machine learning, deep learning, NLP (including LLMs like GPT, BERT, Gemini), and big data frameworks like Spark/Hadoop. Skilled in Python, R, and deep learning libraries (TensorFlow, PyTorch, Keras), with hands-on experience in cloud platforms (AWS, GCP, Azure). Strong leadership in managing data science teams and implementing MLOps practices for scalable model delivery. Excellent communication, cross-functional collaboration, and deep understanding of data privacy, security, and agile methodologies. What You Will Love Working at Capgemini Work on cutting-edge AI/ML solutions using deep learning, NLP, and LLMs like GPT, BERT, and Gemini across cloud platforms (AWS, GCP, Azure). Lead and mentor data science teams to deliver scalable, secure, and production-ready models using MLOps best practices. Clear career progression paths from engineering roles to architecture and consulting. Be part of mission-critical projects that ensure security, compliance, and operational efficiency for Fortune 500 clients.
Posted 1 week ago
4.0 - 9.0 years
5 - 9 Lacs
Mumbai
Work from Office
Your Role
Administer and maintain Qlik Replicate environments on both Linux and Windows platforms.
Perform regular version upgrades, patching, and performance tuning of Qlik Replicate.
Set up and manage various endpoints and ensure secure data replication using HTTPS/HTTP/SSL certificates.
Develop and maintain automation scripts using Perl, Shell, and PowerShell for system integration and monitoring.
Conduct capacity planning and trend analysis to forecast database growth and resource requirements.
Your Profile
4 to 12 years of experience in Qlik Replicate administration and Linux/Unix server management.
Strong understanding of relational database architecture and data replication concepts.
Hands-on experience with Qlik Replicate version upgrades, endpoint configuration, and SSL setup.
Proficient in scripting languages such as Perl, Shell, and PowerShell.
Experience in Linux system administration, including storage and filesystem setup.
What You Will Love Working at Capgemini
Work on enterprise-scale data replication and infrastructure projects supporting global operations.
Gain exposure to cutting-edge data integration tools and cross-platform environments.
Clear career progression from operations to architecture and leadership roles.
Be part of high-impact projects that drive data reliability, availability, and performance.
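The capacity-planning duty above (forecast database growth from observed sizes) reduces to fitting a trend line. The posting asks for Perl/Shell/PowerShell; Python is used here only to keep this page's examples in one language, and the monthly sizes are hypothetical:

```python
# Hypothetical monthly database sizes in GB, oldest first.
sizes_gb = [120.0, 126.0, 133.0, 139.0, 146.0, 152.0]

# Ordinary least-squares slope: average monthly growth.
n = len(sizes_gb)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(sizes_gb) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, sizes_gb))
         / sum((x - mean_x) ** 2 for x in xs))

# Project six months ahead from the latest observation.
months_ahead = 6
forecast = sizes_gb[-1] + slope * months_ahead
print(round(slope, 2), round(forecast, 1))  # 6.46 190.7
```

A linear fit is only a first cut; real capacity plans would also check for seasonality and step changes (e.g., a new replication endpoint coming online).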
Posted 1 week ago
5.0 - 9.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Your role
Responsible for conducting and successfully completing the training delivery of assigned program(s), applying various training methods, innovation, and gamification to accommodate diverse learning styles and preferences.
Responsible for creating/curating new programs, courses, assessments, and artifacts and delivering training on a need basis.
Collaborate closely with stakeholders from different teams (L&D, CFMG, BU, Vendor) to ensure seamless execution of training programs with quality outcomes.
Assess training effectiveness, analyse feedback, and refine/revamp training materials to align with the curriculum signed off by BU SMEs, to maximize learning outcomes and improve the learner experience.
Upskill/upgrade on new and relevant skills aligned to organizational needs and obtain certifications to further enhance expertise, stay competitive, and grow professionally, while keeping abreast of skills and fostering a culture of continuous learning and development.
Skills:
Oracle SQL / MS SQL Server
DWH & ETL concepts
UNIX
Python/Java
Big Data technologies
ETL Tool - Informatica PowerCenter
Reporting Tool - Tableau/Power BI
Your profile
Strategize, implement, and maintain program initiatives that adhere to organizational objectives
Training delivery for I&D skills
Develop program assessment protocols for evaluation and improvement
Ensure overall program goals and objectives are met effectively
Work closely with cross-functional teams
Apply change, risk, and resource management
Analyse, evaluate, and overcome program risks, and produce program reports for management and stakeholders
Ensure effective quality outcomes and the overall integrity of the program
Proactively monitor progress, resolve issues, and initiate appropriate corrective action
Project management
What you'll love about working here
We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work.
Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. You can also participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders, or create solutions to overcome societal and environmental challenges.
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Grade Specific
The role involves leading and managing a team of data engineers, defining and executing the data engineering strategy, and ensuring the effective delivery of data solutions. They provide technical expertise, drive innovation, and collaborate with stakeholders to deliver high-quality, scalable, and reliable data infrastructure and solutions.
Posted 1 week ago
10.0 - 15.0 years
12 - 20 Lacs
Pune
Hybrid
Database Developer
Company: Kiya.ai
Work Location: Pune
Work Mode: Hybrid

JD:
- Strong knowledge of and hands-on development experience in Oracle PL/SQL
- Strong knowledge of and hands-on development experience with SQL analytic functions
- Experience developing complex, numerically intensive business logic
- Good knowledge of and experience in database performance tuning
- Fluency in UNIX scripting

Good to have:
- Knowledge of or experience in any of Python, Hadoop/Hive/Impala, horizontally scalable databases, columnar databases
- Oracle certifications
- Any DevOps tools/techniques: CI/CD, Jenkins/GitLab, source control/Git, deployment automation such as Liquibase
- Experience with production issues/deployments

Interested candidates, drop your resume to saarumathi.r@kiya.ai
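For context on the "SQL analytic functions" requirement above, here is a minimal sketch of a window-function query, run through Python's standard-library `sqlite3` module so it is self-contained. The table and column names (`txn`, `account`, `amount`) are purely illustrative, and the sketch assumes a bundled SQLite new enough for window functions (3.25+); a real role like this would use Oracle's analytic functions, which follow the same `OVER (PARTITION BY ... ORDER BY ...)` syntax.

```python
import sqlite3

# Illustrative data: rank transactions within each account, a typical
# analytic-function task (names here are hypothetical, not from the posting).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (account TEXT, amount REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?)",
                 [("A", 100.0), ("A", 250.0), ("B", 75.0), ("B", 300.0)])

# RANK() OVER (PARTITION BY ...) ranks rows within each partition while
# keeping every row, unlike GROUP BY, which collapses them to one aggregate.
rows = conn.execute("""
    SELECT account, amount,
           RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
    FROM txn
    ORDER BY account, rnk
""").fetchall()
print(rows)
# [('A', 250.0, 1), ('A', 100.0, 2), ('B', 300.0, 1), ('B', 75.0, 2)]
```

The same pattern extends to `ROW_NUMBER()`, `LAG()`/`LEAD()`, and running aggregates, which is what "numerically intensive business logic" in such roles usually leans on.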
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant, AWS DataLake!

Responsibilities
- Knowledge of Data Lake on AWS services, with exposure to creating External Tables and Spark programming
- Able to work on Python programming; writing effective and scalable Python code for automation, data wrangling, and ETL
- Designing and implementing robust applications and working on automation using Python
- Debugging applications to ensure low latency and high availability
- Writing optimized custom SQL queries
- Experienced in team and client handling
- Strong documentation skills related to systems, design, and delivery
- Integrate user-facing elements into applications
- Knowledge of External Tables and Data Lake concepts
- Able to allocate tasks, collaborate on status exchanges, and drive things to successful closure
- Implement security and data protection solutions
- Must be capable of writing SQL queries for validating dashboard outputs
- Must be able to translate visual requirements into detailed technical specifications
- Well versed in handling Excel, CSV, text, JSON, and other unstructured file formats using Python
- Expertise in at least one popular Python framework (such as Django, Flask, or Pyramid)
- Good understanding of and exposure to Git, Bamboo, Confluence, and Jira
- Good with DataFrames and ANSI SQL using pandas
- Team player with a collaborative approach and excellent communication skills

Qualifications we seek in you!
Minimum Qualifications
- BE/B.Tech/MCA
- Excellent written and verbal communication skills
- Good knowledge of Python and PySpark

Preferred Qualifications/Skills
- Strong ETL knowledge of any ETL tool is good to have
- Good to have knowledge of AWS cloud and Snowflake
- Knowledge of PySpark is a plus

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
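As an illustration of the "data wrangling and ETL" and "handling CSV/JSON using Python" skills this posting lists, here is a minimal extract-transform-load sketch using only the standard library. The file content, column names, and filter threshold are all hypothetical; a real pipeline for this role would read from AWS Data Lake paths and likely use pandas or PySpark instead.

```python
import csv
import io
import json

# Inline the "file" via io.StringIO so the sketch is self-contained;
# in practice this would be an S3 object or a Data Lake external table.
raw_csv = io.StringIO("id,amount\n1,10.5\n2,20.0\n3,4.25\n")

# Extract: parse CSV rows into dicts keyed by the header row.
records = list(csv.DictReader(raw_csv))

# Transform: cast string fields to proper types and filter bad/small rows,
# the core of most wrangling steps (threshold is illustrative).
cleaned = [{"id": int(r["id"]), "amount": float(r["amount"])}
           for r in records if float(r["amount"]) >= 5.0]

# Load: serialize to JSON, standing in for a warehouse or table load.
payload = json.dumps(cleaned)
print(payload)
# [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 20.0}]
```

The same extract/transform/load shape carries over directly to pandas (`read_csv`, vectorized transforms, `to_json`) when datasets outgrow plain Python lists.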
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About The Advanced Analytics Team The central Advanced Analytics team at the Abbott Established Pharma Division’s (EPD) headquarters in Basel helps define and lead the transformation towards becoming a global, data-driven company with the help of data and advanced technologies (e.g., Machine Learning, Deep Learning, Generative AI, Computer Vision). To us, Advanced Analytics is an important lever to reach our business targets, now and in the future; it helps differentiate us from our competition and ensures sustainable revenue growth at optimal margins. Hence the central AA team is an integral part of the Strategy Management Office at EPD, which has a very close link and regular interactions with the EPD Senior Leadership Team.

Primary Job Function
With the above requirements in mind, EPD is looking to fill the role of a Data Scientist to build and refine effective Data Science solutions for Abbott EPD worldwide.

Core Job Responsibilities
The Data Scientist rapidly navigates from identifying priorities and helping to generate ideas to implementing solutions. They:
- Participate in and drive data collection, cleaning, analysis, and interpretation (EDA)
- Collaborate with business partners and product owners to ideate on solutions to challenging problems
- Generate insightful visualizations to communicate findings
- Carry out model selection, validation, and possible ways of deployment (in collaboration with the engineering team)
- Write high-quality code with the possibility of deployment in mind
- Share learnings and findings with other data scientists, contributing to the collaborative environment
- Collaborate with Sr. Data Scientists and take full responsibility for analysis and modeling tasks
- Build effective and efficient AA solutions to business needs, leveraging available market resources as much as possible
- Stay committed to continuous learning about the latest trends and technologies
- Work closely with the Product Owners and the Engineering team to ensure delivery of the Data Science part of projects within time, cost, and quality
- Collaborate with external vendors, evaluating their capabilities and ensuring their alignment with data science standards and project requirements
- Continuously engage in hands-on data analysis, modeling, and prototyping of DS frameworks to deliver high-quality outputs

Supervisory/Management Responsibilities
Direct Reports: None. Indirect Reports: None.

Position Accountability/Scope
The Data Scientist is responsible for delivering targeted business impact per initiative in collaboration with key stakeholders and identifying next steps and future impactful opportunities. This individual contributor role involves working with cross-functional teams to build innovative solutions for internal business functions across different geographies.

Minimum Education
Master's or PhD in a relevant field (e.g., applied mathematics, computer science, engineering, applied statistics)

Minimum Experience
- At least 3-5 years of relevant working experience, ideally in a pharma environment
- Solid experience working on full-lifecycle data science; experience applying data science methods to business problems (experience in the financial/commercial or manufacturing/supply chain areas a plus)
- Strong experience in data mining, statistical modelling, predictive modelling, and the development of machine learning algorithms
- Proven problem-solving ability in international settings, preferably with developing markets
- Proven experience working in a cloud environment, preferably AWS/SageMaker
- Strong experience working on full-lifecycle data science; experience applying data science methods to business problems
- Practical experience deploying machine learning solutions
- Strong understanding of good software engineering principles and best practices
- Ability to work with and lead cross-functional teams to bring business and data science closer together; consultancy experience a plus
- Intrinsic motivation to guide people and make Advanced Analytics more accessible to a broader range of stakeholders
- Deep domain expertise in a specific field, such as Artificial Intelligence, Machine Learning, Natural Language Processing, or Computer Vision
- Strong programming skills in languages such as Python or R, with proficiency in data manipulation, wrangling, and modeling techniques
- Strong experience building and debugging complex SQL queries
- Excellent knowledge of statistical techniques, machine learning algorithms, and their practical implementation in real-world scenarios
- Exceptional communication and presentation skills, with the ability to convey complex concepts and insights to both technical and non-technical stakeholders
- Proven track record of delivering data-driven solutions that have had a measurable impact on business outcomes
- Exposure to big data technologies (e.g., Hadoop, Spark) is highly desirable
- Demonstrated ability to drive the adoption of data science best practices, standards, and methodologies within an organization
- Fluency in English a must; additional languages a plus
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Boomi And What Makes Us Special Are you ready to work at a fast-growing company where you can make a difference? Boomi aims to make the world a better place by connecting everyone to everything, anywhere. Our award-winning, intelligent integration and automation platform helps organizations power the future of business. At Boomi, you’ll work with world-class people and industry-leading technology. We hire trailblazers with an entrepreneurial spirit who can solve challenging problems, make a real impact, and want to be part of building something big. If this sounds like a good fit for you, check out boomi.com or visit our Boomi Careers page to learn more. As a Boomi Product Support Engineer, you are an enthusiastic troubleshooter with a passion for helping customers in a fast-paced, collaborative environment. You’re able to quickly understand customer challenges, identify the root cause, and find creative solutions to technical problems. A skilled communicator, you’re able to clearly share your knowledge and recommendations to a wide audience including those with technical and non-technical backgrounds. You are committed to delivering excellence to our customers, our company, and our colleagues. You're a valued member of the Boomi Global Customer Support team, empowering our customers to optimize the Boomi platform to achieve their business outcomes. Our global team provides around-the-clock support to our customers to ensure their success with the Boomi platform. 
What you’ll do:
- Provide exceptional engagement for our customers' initial contact with the Global Customer Support team
- Engage with customers through virtual meetings, chat, and email to manage their expectations, set priorities, and resolve technical issues related to our products, including configuration and networking
- Acknowledge customers' concerns, empathizing with and analyzing the information they’ve provided, and asking questions that refine your initial analysis
- Utilize technical knowledge to ensure timely, accurate solutions, and determine when deeper technical investigation and collaboration is necessary
- Collaborate with Product and Engineering teams, providing customer feedback to help identify new features and functions

Work hours: 5:30pm - 2:30am IST (Monday - Friday); hours & day flexibility; Hyderabad hybrid

The experience you bring:
- 3-4 years of work experience, with 1-2 years of customer-facing experience
- Ability to explain technical details to both technical and non-technical audiences
- Basic knowledge of programming and scripting languages, such as Java, React, Groovy, and JavaScript
- Basic knowledge of Windows and Linux OS
- Basic knowledge of cloud-based software applications (including installation, administration, and troubleshooting)
- Able to show patience, empathy, and compassion
- Passion for problem-solving, continuous learning, and staying up to date on new technology and trends

Bonus points if you have:
- Boomi platform certifications and/or knowledge
- Windows and Linux OS experience
- Cloud-based software application experience, including installation, administration, and troubleshooting
- Ability to analyze error logs for Java programs, Windows OS, and Linux OS
- Ability to read, write, and interpret multiple programming and scripting languages, such as Java, React, Groovy, and JavaScript
- Advanced knowledge of performance tuning techniques and tools
- Amazon Web Services (AWS), Microsoft Azure, and/or Google Cloud Platform (GCP) experience
- Understanding of database administration
- Understanding of network fundamentals, including network trace analysis
- API design and development experience
- Thorough understanding of how data is transmitted securely across the network
- NetSuite, Salesforce, Hadoop, Linux system administration
- Knowledge of Postman and OAuth 2.0
- IT Consultant or Software Developer experience

Be Bold. Be You. Be Boomi. We take pride in our culture and core values and are committed to being a place where everyone can be their true, authentic self. Our team members are our most valuable resources, and we look for and encourage diversity in backgrounds, thoughts, life experiences, knowledge, and capabilities. All employment decisions are based on business needs, job requirements, and individual qualifications. Boomi strives to create an inclusive and accessible environment for candidates and employees. If you need accommodation during the application or interview process, please submit a request to talent@boomi.com. This inbox is strictly for accommodations; please do not send resumes or general inquiries.
Posted 1 week ago