5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for an immediate joiner: an experienced Big Data Developer with a strong background in Kafka, PySpark, Python/Scala, Spark, SQL, and the Hadoop ecosystem, with over 5 years of experience. This role requires hands-on expertise in big data technologies and the ability to design and implement robust data processing solutions.

Responsibilities
- Design, develop, and maintain scalable data processing pipelines using Kafka, PySpark, Python/Scala, and Spark.
- Work extensively with the Kafka and Hadoop ecosystems, including HDFS, Hive, and other related technologies.
- Write efficient SQL queries for data extraction, transformation, and analysis.
- Implement and manage Kafka streams for real-time data processing.
- Utilize scheduling tools to automate data workflows and processes.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality and integrity by implementing robust data validation processes.
- Optimize existing data processes for performance and scalability.

Requirements
- Experience with GCP.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with machine learning and data analysis tools.
- Understanding of data governance and compliance standards.

This job was posted by Arun Kumar K from krtrimaIQ Cognitive Solutions.
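For context on the Kafka-plus-PySpark work this role centers on, here is a minimal sketch of a Structured Streaming job that consumes a Kafka topic and lands parsed records in Parquet. The broker address, topic, schema, and paths are illustrative placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

(
    events.writeStream.format("parquet")
    .option("path", "/data/events")               # placeholder sink path
    .option("checkpointLocation", "/chk/events")  # required for fault tolerance
    .start()
    .awaitTermination()
)
```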
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The Opportunity "We are seeking a senior software engineer to undertake a range of feature development tasks that continue the evolution of our DMP Streaming product. You will demonstrate the required potential and technical curiosity to work on software that utilizes a range of leading edge technologies and integration frameworks. Given your depth of experience, we also want you to technically guide more junior members of the team, instilling both good engineering practices and inspiring them to grow" What You'll Contribute Implement product changes, undertaking detailed design, programming, unit testing and deployment as required by our SDLC process Investigate and resolve reported software defects across supported platforms Work in conjunction with product management to understand business requirements and convert them into effective software designs that will enhance the current product offering Produce component specifications and prototypes as necessary Provide realistic and achievable project estimates for the creation and development of solutions. This information will form part of a larger release delivery plan Develop and test software components of varying size and complexity Design and execute unit, link and integration test plans, and document test results. Create test data and environments as necessary to support the required level of validation Work closely with the quality assurance team and assist with integration testing, system testing, acceptance testing, and implementation Produce relevant system documentation Participate in peer review sessions to ensure ongoing quality of deliverables. Validate other team members' software changes, test plans and results Maintain and develop industry knowledge, skills and competencies in software development What We're Seeking A Bachelor’s or Master’s degree in Computer Science, Engineering, or related field 10+ Java software development experience within an industry setting Ability to work in both Windows and UNIX/Linux operating systems Detailed understanding of software and testing methods Strong foundation and grasp of design models and database structures Proficient in Kubernetes, Docker, and Kustomize Exposure to the following technologies: Apache Storm, MySQL or Oracle, Kafka, Cassandra, OpenSearch, and API (REST) development Familiarity with Eclipse, Subversion and Maven Ability to lead and manage others independently on major feature changes Excellent communication skills with the ability to articulate information clearly with architects, and discuss strategy/requirements with team members and the product manager Quality-driven work ethic with meticulous attention to detail Ability to function effectively in a geographically-diverse team Ability to work within a hybrid Agile methodology Understand the design and development approaches required to build a scalable infrastructure/platform for large amounts of data ingestion, aggregation, integration and advanced analytics Experience of developing and deploying applications into AWS or a private cloud Exposure to any of the following: Hadoop, JMS, Zookeeper, Spring, JavaScript, Angular, UI Development Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others. The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences. 
Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so. An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie. Show more Show less
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us

Acceldata is the market leader in Enterprise Data Observability. Founded in 2018, Silicon Valley-based Acceldata has developed the world's first Enterprise Data Observability Platform to help build and operate great data products. Enterprise Data Observability sits at the intersection of today's hottest and most crucial technologies, such as AI, LLMs, Analytics, and DataOps. Acceldata provides mission-critical capabilities that deliver highly trusted and reliable data to power enterprise data products. Delivered as a SaaS product, Acceldata's solutions have been embraced by global customers such as HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, Oracle, PubMatic, PhonePe (Walmart), Hersheys, Dun & Bradstreet, and many more. Acceldata is a Series-C funded company whose investors include Insight Partners, March Capital, Lightspeed, Sorenson Ventures, Industry Ventures, and Emergent Ventures.

About the Role

We are looking for an experienced Lead SDET for our Open Source Data Platform (ODP), specializing in ensuring the quality and performance of large-scale data systems. In this role, you will work closely with development and operations teams to design and execute comprehensive test strategies for the platform, including Hadoop, Spark, Hive, Kafka, and other related technologies. You will focus on test automation, performance tuning, and identifying bottlenecks in distributed data systems.

Your key responsibilities will include writing test plans, creating automated test scripts, and conducting functional, regression, and performance testing. You will be responsible for identifying and resolving defects, ensuring data integrity, and improving testing processes. Strong collaboration skills are essential, as you will interact with cross-functional teams and drive quality initiatives. Your work will directly contribute to maintaining high quality standards for big data solutions and enhancing their reliability at scale.

You are a great fit for this role if you have
- Proven expertise in Quality Engineering, with a strong background in test automation, performance testing, and defect management across multiple data platforms.
- A proactive mindset to define and implement comprehensive test strategies that ensure the highest quality standards are met.
- Experience with both functional and non-functional testing, with a particular focus on automated test development.
- A collaborative approach and the ability to work cross-functionally with development teams to resolve issues and deliver timely fixes.
- Strong communication skills, with the ability to mentor junior engineers and share knowledge to improve testing practices across the team.
- A commitment to continuous improvement, with the ability to analyze testing processes and recommend enhancements that align with industry best practices.
- The ability to quickly learn new technologies.

What We Look For
- 6-10 years of hands-on experience in quality engineering and quality assurance, focusing on test automation, performance testing, and defect management across multiple data platforms.
- Proficiency in programming languages such as Java, Python, or Scala for writing test scripts and automating test cases, with hands-on experience developing automated tests using test automation frameworks to ensure robust and scalable test suites.
- Proven ability to define and execute comprehensive test strategies, including writing test plans, test cases, and scripts for both functional and non-functional testing, to ensure predictable delivery of high-quality products and solutions.
- Experience with version control systems like Git and CI/CD tools such as Jenkins or GitLab CI to manage code changes and automate test execution within the development pipeline.
- Expertise in identifying, tracking, and resolving defects and issues, collaborating closely with developers and product teams to ensure timely fixes.
- Strong communication skills, with the ability to work cross-functionally with development teams and mentor junior team members to improve testing practices and tools.
- Ability to analyze testing processes, recommend improvements, and ensure the testing environment aligns with industry best practices, contributing to the overall quality of software.

Acceldata is an equal-opportunity employer

At Acceldata, we are committed to providing equal employment opportunities regardless of job history, disability, gender identity, religion, race, color, caste, marital/parental status, veteran status, or any other special status. We stand against the discrimination of employees and individuals and are proud to be an equitable workplace that welcomes individuals from all walks of life if they fit the designated roles and responsibilities.

Acceldata is all about working with some of the best minds in the industry and experiencing a culture that values an 'out-of-the-box' mindset. If you want to push boundaries, learn continuously, and grow to be the best version of yourself, Acceldata is the place to be!
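To make the kind of automated data-platform testing this role describes concrete, here is a hedged sketch of a pytest-based data-quality check run through Spark. The table location and column names are invented for illustration.

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local session so the suite can run in CI without a cluster
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_orders_are_unique_and_complete(spark):
    df = spark.read.parquet("/warehouse/orders")  # placeholder table location

    # Primary-key uniqueness check
    assert df.count() == df.dropDuplicates(["order_id"]).count()
    # Non-null constraint on a required column
    assert df.filter(df["customer_id"].isNull()).count() == 0
```

Checks like these slot naturally into a Jenkins or GitLab CI pipeline, as the qualifications above describe.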
Posted 6 days ago
5.0 years
0 Lacs
India
On-site
About Oportun

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

Working at Oportun

Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable, and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

Position Overview

As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software and data platforms in pursuit of the engineering group's charter. Your mastery of a technical domain enables you to take up business problems and solve them with technical solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. In this role you will have the opportunity to lead the technology effort, from technical requirements gathering to final successful delivery of the product, for large initiatives (cross-functional, multi-month projects).

Responsibilities

Data Architecture and Design
- Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements.
- Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.

Data Pipeline Development and Optimization
- Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Optimize data pipelines for performance, reliability, and scalability.

Database Management and Optimization
- Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security.
- Implement and manage ETL processes for efficient data loading and retrieval.

Data Quality and Governance
- Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations.
- Drive initiatives to improve data quality and documentation of data assets.

Mentorship and Leadership
- Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth.
- Lead and participate in code reviews, ensuring best practices and high-quality code.

Collaboration and Stakeholder Management
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet them.
- Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.

Performance Monitoring and Optimization
- Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.

Common Requirements
- You have a strong understanding of a business or system domain, with sufficient knowledge and expertise around the appropriate metrics and trends.
- You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
- You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
- You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability.
- You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
- You play a significant role in the ongoing evolution and refinement of the tools and applications used by the team, and drive adoption of new practices within your team.
- You take ownership of customer issues, including initial troubleshooting, identification of root cause, and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
- You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
- You independently drive and lead multiple features, contribute to large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed.
- You keep your lead/EM informed about your work and that of the team, including escalation of issues, so they can share updates with stakeholders.

Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
- Proficiency in programming languages like Python/PySpark and Java or Scala.
- Expertise in big data technologies such as Hadoop, Spark, and Kafka.
- In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
- Experience and expertise in building complex end-to-end data pipelines.
- Experience with orchestration and designing job schedules using CI/CD and workflow tools like Jenkins, Airflow, or Databricks.
- Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
- Ability to mentor junior team members.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
- Strong leadership, problem-solving, and decision-making skills.
- Excellent communication and collaboration abilities.
- Familiarity or certification in Databricks is a plus.
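As a pointer to the Airflow-style orchestration the qualifications mention, here is a minimal sketch of a daily ETL DAG. The DAG id and the callable are invented for illustration; the `schedule` parameter assumes Airflow 2.4 or later.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transform_load():
    # Placeholder for the actual pipeline logic (e.g., pull raw files, load a warehouse)
    print("running ETL step")

with DAG(
    dag_id="daily_member_etl",       # invented DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # 'schedule' requires Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="etl", python_callable=extract_transform_load)
```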
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/. We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).
Posted 6 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description

Amazon Selection and Catalog Systems (ASCS) builds the systems that host and run the comprehensive e-commerce product catalog. We power the online shopping experience for customers worldwide, enabling them to find, discover, and purchase anything they desire. Our scaled, distributed systems process hundreds of millions of updates across billions of products, including physical, digital, and service offerings.

You will be part of the Catalog Support Programs (CSP) team under Catalog Support Operations (CSO) in the ASCS org. CSP provides program management, technical support, and strategic initiatives to enhance the customer experience, owning the implementation of business logic and configurations for ASCS. We are establishing a new centralized Business Intelligence team to build self-service analytical products for ASCS that provide relevant insights and data deep dives across the business. By leveraging advanced analytics and AI/ML, we will transform catalog data into predictive insights, helping prevent customer issues before they arise. Real-time intelligence will support proactive decision-making, enabling faster, data-driven decisions across the organization and driving long-term growth and an enhanced customer experience.

We are looking for a creative and goal-oriented BI Engineer to join our team to harness the full potential of data-driven insights to make informed decisions, identify business opportunities, and drive business growth. This role requires an individual with excellent analytical abilities, knowledge of business intelligence solutions, business acumen, and the ability to work with various tech/product teams across ASCS. This BI Engineer will support the ASCS org by owning complex reporting and automating reporting solutions, ultimately providing insights and drivers for decision making. You must be a self-starter and be able to learn on the go. You should have excellent written and verbal communication skills, so you can work with business owners to develop and define key business questions, and build data sets that answer those questions.

As a Business Intelligence Engineer in the CSP team, you will be responsible for analyzing petabytes of data to identify business trends and points of customer friction, and for developing scalable solutions to enhance customer experience and safety. You will work closely with internal stakeholders to define key performance indicators (KPIs), implement them in dashboards and reports, and present insights in a concise and effective manner. This role involves collaborating with business and tech leaders within ASCS and cross-functional teams to solve problems, create operational efficiencies, and deliver against high organizational standards. You should be able to apply a breadth of tools, data sources, and analytical techniques to answer a wide range of high-impact business questions and proactively uncover new insights that drive decision-making by senior leadership.

As a key member of the CSP team, you will continually raise the bar on both quality and performance. You will bring innovation, a strategic perspective, a passionate voice, and an ability to prioritize and execute against a fast-moving set of priorities, competitive pressures, and operational initiatives. There will be a steep learning curve, and you will add a fair amount of business skills along the way.
Key job responsibilities
- Work closely with BIEs, Data Engineers, and Scientists on the team, collaborating effectively with product managers to create scalable solutions for business problems
- Create program goals and related metrics, track progress, and manage through obstacles to help the team achieve objectives
- Identify opportunities for improvement or automation in existing data processes and lead the changes using business acumen and data handling skills
- Ensure best practices on data integrity, design, testing, implementation, documentation, and knowledge sharing
- Contribute to supplier operations strategy development based on data analysis
- Lead strategic projects to formalize and scale organizational processes
- Build and manage weekly, monthly, and quarterly business review metrics
- Build data reports and dashboards using SQL, Excel, and other tools to improve business efficiency across programs
- Understand loosely defined or structured problems and provide BI solutions for difficult problems, delivering large-scale BI solutions
- Provide solutions that drive the team's business decisions and highlight new opportunities
- Improve code quality and optimize BI processes
- Demonstrate proficiency in a scripting language, data modeling, data pipeline design, and applying basic statistical methods (e.g., regression) to difficult business problems

A day in the life

A day in the life of a BIE-II will include:
- Working closely with cross-functional teams including Product/Program Managers, Software Development Managers, Applied/Research/Data Scientists, and Software Developers
- Building dashboards, performing root cause analysis, and sharing actionable insights with stakeholders to enable data-informed decision making
- Leading reporting and analytics initiatives to drive data-informed decision making
- Designing, developing, and maintaining ETL processes and data visualization dashboards using Amazon QuickSight
- Transforming complex business requirements into actionable analytics solutions

About The Team

This central BIE team within ASCS will be responsible for building a structured analytical data layer, bringing BI discipline by defining metrics in a standardized way and establishing a single definition of metrics across the catalog ecosystem. They will also identify clear sources of truth for critical data. The team will build and maintain the data pipelines for critical projects tailored to the needs of ASCS teams, leveraging catalog data to provide a unified view of product information. This will support real-time decision-making and empower teams to make data-driven decisions quickly, driving innovation. This team will leverage advanced analytics that can shift us to a proactive, data-driven approach, enabling informed decisions that drive growth and enhance the customer experience. This team will adopt best practices, standardize metrics, and continuously iterate on queries and data sets as they evolve. Automated quality controls and real-time monitoring will ensure consistent data quality across the organization.

Basic Qualifications
- 4+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and Matlab
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling
- Experience developing and presenting recommendations for new metrics, allowing better understanding of the performance of the business
- Experience writing complex SQL queries
- Bachelor's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field
- Experience with scripting languages (e.g., Python, Java, R) and big data technologies/languages (e.g., Spark, Hive, Hadoop, PyTorch, PySpark) to build and maintain data pipelines and ETL processes
- Proficiency in SQL, data analysis, and data visualization tools like Amazon QuickSight to drive data-driven decision making
- Experience applying basic statistical methods (e.g., regression, t-test, chi-squared) as well as exploratory, deterministic, and probabilistic analysis techniques to solve complex business problems
- Experience gathering business requirements and using industry-standard business intelligence tools to extract data, formulate metrics, and build reports
- Track record of generating key business insights and collaborating with stakeholders
- Strong verbal and written communication skills, with the ability to effectively present data insights to both technical and non-technical audiences, including senior management

Preferred Qualifications
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets
- Master's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field
- Proven track record of conducting large-scale, complex data analysis to support business decision-making in a data warehouse environment
- Demonstrated ability to translate business needs into data-driven solutions and vice versa
- Relentless curiosity and drive to explore emerging trends and technologies in the field
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis, as well as exploratory, deterministic, and probabilistic analysis techniques
- Experience designing and implementing custom reporting systems using automation tools
- Knowledge of how to improve code quality and optimize BI processes (e.g., speed, cost, reliability)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2990532
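As a small illustration of the business-review metric builds this posting lists, here is a hedged pandas sketch that rolls raw events up into a weekly KPI. All column names and values are invented.

```python
import pandas as pd

# Invented raw-events frame standing in for a warehouse extract
events = pd.DataFrame({
    "week": ["2024-W01", "2024-W01", "2024-W02"],
    "marketplace": ["IN", "US", "IN"],
    "updates": [1000, 2500, 1200],
    "defects": [13, 40, 9],
})

# Weekly defect rate per marketplace, the kind of metric a business review tracks
kpi = (
    events.groupby(["week", "marketplace"], as_index=False)[["updates", "defects"]]
    .sum()
    .assign(defect_rate=lambda d: d["defects"] / d["updates"])
)
print(kpi)
```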
Posted 6 days ago
2.0 years
0 Lacs
Hyderābād
On-site
Job Description:

We are looking for experienced and passionate Technical Trainers to join our team to deliver engaging training sessions in the following domains:
- Python Programming
- Java Development
- Data Science (ML, AI, Statistics, Visualization)
- Microsoft Azure (Cloud, DevOps, Data Engineering)
- Data Engineering (ETL, Big Data, SQL, Python)

Responsibilities:
- Design, develop, and deliver effective training content and materials (theory + hands-on)
- Conduct online/offline training sessions as per the schedule
- Provide real-time examples and projects for better understanding
- Guide learners through assessments, case studies, and live projects
- Monitor the progress of trainees and provide feedback
- Stay updated with current technologies and industry trends
- Address student doubts and ensure clarity of concepts

Requirements:
- Strong subject knowledge in the chosen domain
- Excellent communication and presentation skills
- Prior training or mentoring experience preferred
- Ability to simplify complex technical concepts
- Bachelor's/Master's degree in Computer Science/Engineering or a related field
- Certification in the respective technology is a plus (e.g., Microsoft Certified: Azure Data Engineer)

Experience Required: 2+ years (freshers with strong knowledge can also apply)
Salary: As per industry standards

Preferred Skills by Domain:

Python Trainer
- Core Python, OOPs, file handling, libraries (Pandas, NumPy, Matplotlib)
- Hands-on coding and debugging knowledge

Java Trainer
- Core & Advanced Java, JDBC, Collections, Spring Framework
- Project implementation guidance

Data Science Trainer
- Python/R, Statistics, Machine Learning, Deep Learning
- Tools: Jupyter, Power BI, Tableau, SQL

Azure Trainer
- Azure Fundamentals, Azure Data Factory, Azure DevOps
- Experience in real-world Azure projects

Data Engineering Trainer
- Data pipelines, ETL, SQL, Python, Hadoop/Spark
- Cloud-based data solutions (AWS/GCP/Azure)

How to Apply: Interested candidates can send their resume.

Job Type: Full-time
Schedule: Day shift
Work Location: In person
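For a sense of the hands-on style the Python Trainer track calls for (Pandas, Matplotlib), here is a small classroom-style example a trainer might walk through. The CSV file name and the score column are placeholders.

```python
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("scores.csv")  # placeholder dataset with a 'score' column
print(df.describe())            # quick summary statistics for class discussion

# Visualize the distribution so learners connect statistics to a picture
df["score"].plot(kind="hist", bins=10)
plt.xlabel("score")
plt.title("Distribution of scores")
plt.show()
```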
Posted 6 days ago
0 years
4 - 7 Lacs
Hyderābād
On-site
Experience: 8+ years
Location: Hyderabad
Notice period: Immediate joiners only

- 8+ years of strong ETL Informatica experience
- Oracle, Hadoop, and MongoDB experience
- Strong SQL/Unix knowledge
- Experience working with RDBMS; Teradata preferred
- Big Data/Hadoop experience is good to have
- Python or other programming knowledge is good to have

Your future duties and responsibilities

Required qualifications to be successful in this role
- B.Tech

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.

Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.

You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team, one of the largest IT and business consulting services firms in the world.
Posted 6 days ago
3.0 years
10 - 12 Lacs
India
On-site
About the Role

We are seeking a highly skilled Senior AI/ML Engineer to join our dynamic team. The ideal candidate will have extensive experience in designing, building, and deploying machine learning models and AI solutions to solve real-world business challenges. You will collaborate with cross-functional teams to create and integrate AI/ML models into end-to-end applications, ensuring models are accessible through APIs or product interfaces for real-time usage.

Responsibilities
- Lead the design, development, and deployment of machine learning models for various use cases such as recommendation systems, computer vision, natural language processing (NLP), and predictive analytics.
- Work with large datasets to build, train, and optimize models using techniques such as classification, regression, clustering, and neural networks.
- Fine-tune pre-trained models and develop custom models based on specific business needs.
- Collaborate with data engineers to build scalable data pipelines and ensure the smooth integration of models into production.
- Collaborate with frontend/backend engineers to build AI-driven features into products or platforms.
- Build proof-of-concept or production-grade AI applications and tools with intuitive UIs or workflows.
- Ensure scalability and performance of deployed AI solutions within the full application stack.
- Implement model monitoring and maintenance strategies to ensure performance, accuracy, and continuous improvement of deployed models.
- Design and implement APIs or services that expose machine learning models to frontend or other systems.
- Utilize cloud platforms (AWS, GCP, Azure) to deploy, manage, and scale AI/ML solutions.
- Stay up to date with the latest advancements in AI/ML research, and apply innovative techniques to improve existing systems.
- Communicate effectively with stakeholders to understand business requirements and translate them into AI/ML-driven solutions.
- Document processes, methodologies, and results for future reference and reproducibility.

Required Skills & Qualifications
- Experience: 3-5+ years in AI/ML engineering roles, with a proven track record of successfully delivering machine learning projects.
- AI/ML Expertise: Strong knowledge of machine learning algorithms (supervised, unsupervised, reinforcement learning) and AI techniques, including NLP, computer vision, and recommendation systems.
- Programming Languages: Proficient in Python and relevant ML libraries such as TensorFlow, PyTorch, Scikit-learn, and Keras.
- Data Manipulation: Experience with data manipulation libraries such as Pandas, NumPy, and SQL for managing and processing large datasets.
- Model Development: Expertise in building, training, deploying, and fine-tuning machine learning models in production environments.
- Cloud Platforms: Experience with cloud platforms such as AWS, GCP, or Azure for the deployment and scaling of AI/ML models.
- MLOps: Knowledge of MLOps practices for model versioning, automation, and monitoring.
- Data Preprocessing: Proficient in data cleaning, feature engineering, and preparing datasets for model training.
- Strong experience building and deploying end-to-end AI-powered applications: not just models, but full system integration.
- Hands-on experience with Flask, FastAPI, Django, or similar frameworks for building REST APIs for model serving.
- Understanding of system design and software architecture for integrating AI into production environments.
- Experience with frontend/backend integration (basic React/Next.js knowledge is a plus).
- Demonstrated projects where AI models were part of deployed user-facing applications.
- NLP & Computer Vision: Hands-on experience with natural language processing or computer vision projects.
- Big Data: Familiarity with big data tools and frameworks (e.g., Apache Spark, Hadoop) is an advantage.
- Problem-Solving Skills: Strong analytical and problem-solving abilities, with a focus on delivering practical AI/ML solutions.

Nice to Have
- Experience with deep learning architectures (CNNs, RNNs, GANs, etc.) and techniques.
- Knowledge of deployment strategies for AI models using APIs, Docker, or Kubernetes.
- Experience building full-stack applications powered by AI (e.g., chatbots, recommendation dashboards, AI assistants).
- Experience deploying AI/ML models in real-time environments using API gateways, microservices, or orchestration tools like Docker and Kubernetes.
- Solid understanding of statistics and probability.
- Experience working in Agile development environments.

What You'll Gain
- Be part of a forward-thinking team working on cutting-edge AI/ML technologies.
- Collaborate with a diverse, highly skilled team in a fast-paced environment.
- Work on impactful projects with real-world applications.
- Competitive salary and career growth opportunities.

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,200,000.00 per year
Schedule: Day shift, Morning shift
Supplemental Pay: Performance bonus
Work Location: In person
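To ground the model-serving pattern the requirements call out (FastAPI exposing a trained model behind a REST endpoint), here is a minimal sketch. The model artifact path and the flat feature list are assumptions for illustration; `list[float]` requires Python 3.9+.

```python
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # placeholder artifact path

class Features(BaseModel):
    values: list[float]  # assumed flat feature vector

@app.post("/predict")
def predict(features: Features):
    # scikit-learn estimators expect a 2-D array: one row per sample
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}
```

Saved as `serve.py`, this could be run locally with `uvicorn serve:app --reload` and containerized with Docker for the Kubernetes-style deployments mentioned above.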
Posted 6 days ago
8.0 years
8 - 9 Lacs
Gurgaon
On-site
You Lead the Way. We've Got Your Back.

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact: every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong. Join Team Amex and let's lead the way together.

American Express has embarked on an exciting transformation driven by an energetic new team of high performers. This is a great opportunity to join the Customer Marketing organization within American Express Technologies and become a driver of this exciting journey. We are looking for a highly skilled and experienced Senior Engineer with a history of building Big Data, GCP Cloud, Python, and Spark applications. The Senior Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Joining the Enterprise Marketing team, this role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team, we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. We pride ourselves on a culture of kindness and positivity, and a continuous focus on supporting colleague development to help you achieve your career goals. We lead with integrity, and we emphasize work/life balance for all of our teammates.

How will you make an impact in this role?

There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:
- Developing innovative, high-quality, and robust operational engineering capabilities.
- Developing software in our technology stack, which is constantly evolving but currently includes Big Data, Spark, Python, Scala, GCP, and the Adobe suite (e.g., Customer Journey Analytics).
- Working with business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps.
- Creating technical solution designs to meet business requirements.
- Defining best practices to be followed by the team.
- Taking your place as a core member of an Agile team driving the latest development practices.
- Identifying and driving reengineering opportunities, and opportunities for adopting new technologies and methods.
- Suggesting and recommending solution architectures to resolve business problems.
- Performing peer code reviews and participating in technical discussions with the team on the best possible solutions.

As part of our diverse tech team, you can architect, code, and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. American Express offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology with #TeamAmex.

Minimum Qualifications:
- BS or MS degree in computer science, computer engineering, or another technical discipline, or equivalent work experience.
- 8+ years of hands-on software development experience with Big Data and analytics solutions: Hadoop, Hive, Spark, Scala, Python, shell scripting, GCP BigQuery, Bigtable, and Airflow.
- Working knowledge of the Adobe suite, such as Adobe Experience Platform and Adobe Customer Journey Analytics.
- Proficiency in SQL and database systems, with experience in designing and optimizing data models for performance and scalability.
- Design and development experience with Kafka, real-time ETL pipelines, and APIs is desirable.
- Experience in designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies.
- Certifications in a cloud platform (GCP Professional Data Engineer) are a plus.
- Understanding of distributed (multi-tiered) systems, data structures, algorithms, and design patterns.
- Strong object-oriented programming skills and design patterns.
- Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git, Maven).
- Good knowledge of and experience with configuration management tools like GitHub.
- Ability to analyze complex data engineering problems, propose effective solutions, and implement them effectively.
- Looks proactively beyond the obvious for continuous improvement opportunities.
- Communicates effectively with product and cross-functional teams.
- Willingness to learn new technologies and leverage them to their optimal potential.
- Understanding of various SDLC methodologies; familiarity with Agile and Scrum ceremonies.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
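Given the GCP BigQuery emphasis in the qualifications, here is a hedged sketch of the standard access pattern using the official google-cloud-bigquery client. The project, dataset, and table names are invented placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

query = """
    SELECT campaign_id, COUNT(*) AS touches
    FROM `my-project.marketing.events`  -- placeholder dataset.table
    GROUP BY campaign_id
    ORDER BY touches DESC
    LIMIT 10
"""

# Rows are returned lazily; attribute access mirrors the SELECT aliases
for row in client.query(query).result():
    print(row.campaign_id, row.touches)
```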
Posted 6 days ago
3.0 years
5 - 8 Lacs
Gurgaon
Remote
Job description

About this role

Want to elevate your career by being a part of the world's largest asset manager? Do you thrive in an environment that fosters positive relationships and recognizes stellar service? Is analyzing complex problems and identifying solutions your passion? Look no further. BlackRock is currently seeking a candidate to become part of our Global Investment Operations Data Engineering team. We recognize that strength comes from diversity, and will embrace your rare skills, eagerness, and passion while giving you the opportunity to grow professionally and as an individual. We know you want to feel valued every single day and be recognized for your contribution. At BlackRock, we strive to empower our employees and actively engage your involvement in our success. With over USD $11.5 trillion of assets under management, we have an extraordinary responsibility: our technology and services empower millions of investors to save for retirement, pay for college, buy a home, and improve their financial well-being. Come join our team and experience what it feels like to be part of an organization that makes a difference.

Technology & Operations

Technology & Operations (T&O) is responsible for the firm's worldwide operations across all asset classes and geographies. The operational functions are aligned with clients, products, fund structures, and our third-party provider networks. Within T&O, Global Investment Operations (GIO) is responsible for the development of the firm's operating infrastructure to support BlackRock's investment businesses worldwide. GIO spans Trading & Market Documentation, Transaction Management, Collateral Management & Payments, Asset Servicing (including Corporate Actions and Cash & Asset Operations), and Securities Lending Operations. GIO provides operational service to BlackRock's Portfolio Managers and Traders globally, as well as industry-leading service to our end clients.

GIO Engineering

Working in close partnership with GIO business users and other technology teams throughout BlackRock, GIO Engineering is responsible for developing and providing data and software solutions that support GIO business processes globally. GIO Engineering solutions combine technology, data, and domain expertise to drive exception-based, function-agnostic, service-oriented workflows, data pipelines, and management dashboards.

The Role - GIO Engineering Data Lead

Work to date has been focused on building out robust data pipelines and lakes relevant to specific business functions, along with associated pools and Tableau/Power BI dashboards for internal BlackRock clients. The next stage in the project involves Azure/Snowflake integration and commercializing the offering so BlackRock's 150+ Aladdin clients can leverage the same curated data products and dashboards that are available internally. The successful candidate will contribute to the technical design and delivery of a curated line of data products, related pipelines, and visualizations in collaboration with SMEs across GIO, Technology and Operations, and the Aladdin business.

Responsibilities

Specifically, we expect the role to involve the following core responsibilities, and would expect a successful candidate to be able to demonstrate the following (not in order of priority):
- Design, develop, and maintain a data analytics infrastructure
- Work with a project manager on, or drive the project management of, team deliverables
- Work with subject matter experts and users to understand the business and their requirements
- Help determine the optimal dataset and structure to deliver on those user requirements
- Work within a standard data/technology deployment workflow to ensure that all deliverables and enhancements are provided in a disciplined, repeatable, and robust manner
- Work with the team lead to understand and help prioritize the team's queue of work
- Automate periodic (daily/weekly/monthly/quarterly or other) reporting processes to minimize or eliminate the associated developer BAU activities
- Leverage industry-standard and internal tooling whenever possible to reduce the amount of custom code that requires maintenance

Experience
- 3+ years of experience writing ETL, data curation, and analytical jobs using Hadoop-based distributed computing technologies: Spark/PySpark, Hive, etc.
- 3+ years of knowledge of and experience working with large enterprise databases, preferably cloud-based databases/data warehouses like Snowflake on an Azure or AWS setup
- Knowledge of and experience working with data science / machine learning / Gen AI frameworks in Python (Azure/OpenAI, Meta, etc.)
- Knowledge of and experience building reporting and dashboards using BI tools: Tableau, MS Power BI, etc.
- Prior experience working with source code version management tools like GitHub
- Prior experience working with and following Agile-based workflow paths and ticket-based development cycles
- Prior experience setting up infrastructure and working on big data analytics
- Strong analytical skills, with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy
- Experience working with SMEs/Business Analysts, and working with stakeholders for sign-off

Our benefits

To help you stay energized, engaged, and inspired, we offer a wide range of benefits, including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents, and Flexible Time Off (FTO) so you can relax, recharge, and be there for the people you care about.

Our hybrid work model

BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes, and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued, and supported with networks, benefits, and development opportunities to help them thrive.
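To illustrate the PySpark-based data curation work this role describes, here is a minimal sketch of a batch job that standardizes raw records into a curated, partitioned dataset. Paths, column names, and status values are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-trades").getOrCreate()

raw = spark.read.parquet("/lake/raw/trades")  # placeholder source path

curated = (
    raw.withColumn("trade_date", F.to_date("trade_ts"))  # assumed timestamp column
       .filter(F.col("status") == "SETTLED")             # assumed status values
       .select("trade_id", "trade_date", "notional", "currency")
       .dropDuplicates(["trade_id"])
)

# Partitioned layout keeps downstream dashboard queries cheap
curated.write.mode("overwrite").partitionBy("trade_date").parquet("/lake/curated/trades")
```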
For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R254094
Posted 6 days ago
12.0 years
0 Lacs
New Delhi, Delhi, India
On-site
As a Senior Manager for Data Science, Data Modelling & Analytics, you will lead a team of data scientists and analysts while actively contributing to the development and implementation of advanced analytics solutions. This role requires a blend of strategic leadership and hands-on technical expertise to drive data-driven decision-making across the organization.

Job Description:

Key Responsibilities

Hands-On Technical Contribution
- Design, develop, and deploy advanced machine learning models and statistical analyses to solve complex business problems.
- Utilize programming languages such as Python, R, and SQL to manipulate data and build predictive models.
- Understand end-to-end data pipelines, including data collection, cleaning, transformation, and visualization.
- Collaborate with IT and data engineering teams to integrate analytics solutions into production environments.
- Provide thought leadership on solutions and metrics based on an understanding of the nature of the business requirement.

Team Leadership & Development
- Lead, mentor, and manage a team of data scientists and analysts, fostering a collaborative and innovative environment.
- Provide guidance on career development, performance evaluations, and skill enhancement.
- Promote continuous learning and adoption of best practices in data science methodologies.
- Engage and manage a hierarchical team while fostering a culture of collaboration.

Strategic Planning & Execution
- Collaborate with senior leadership to define the data science strategy aligned with business objectives.
- Identify and prioritize high-impact analytics projects that drive business value.
- Ensure the timely delivery of analytics solutions, balancing quality, scope, and resource constraints.

Client Engagement & Stakeholder Management
- Serve as the primary point of contact for clients, understanding their business challenges and translating them into data science solutions.
- Lead client presentations, workshops, and discussions to communicate complex analytical concepts in an accessible manner.
- Develop and maintain strong relationships with key client stakeholders, ensuring satisfaction and identifying opportunities for further collaboration.
- Manage client expectations, timelines, and deliverables, ensuring alignment with business objectives.
- Develop and deliver regular reports and dashboards to senior management, market stakeholders, and clients, highlighting key insights and performance metrics.
- Act as a liaison between technical teams and business units to align analytics initiatives with organizational goals.

Cross-Functional Collaboration
- Work closely with cross-capability teams such as Business Intelligence, Market Analytics, and Data Engineering to integrate analytics solutions into business processes.
- Translate complex data insights into actionable recommendations for non-technical stakeholders.
- Facilitate workshops and presentations to promote data-driven conversations across the organization.
- Work closely with support functions to provide timely updates to leadership on operational metrics.

Governance & Compliance
- Ensure adherence to data governance policies, including data privacy regulations (e.g., GDPR, PDPA).
- Implement best practices for data quality, security, and ethical use of analytics.
- Stay informed about industry trends and regulatory changes impacting data analytics.

Qualifications

Education: Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.

Experience:
- 12+ years of experience in advanced analytics, data science, data modelling, machine learning, or Generative AI, with 5+ years in a leadership capacity.
- Proven track record of managing and delivering complex analytics projects.
- Familiarity with the BFSI/Hi-Tech/Retail/Healthcare industries and experience with product, transaction, and customer-level data.
- Experience with media data will be advantageous.

Technical Skills:
- Proficiency in programming languages like Python, R, or SQL.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with big data platforms (e.g., Hadoop, Spark) and cloud services (e.g., AWS, GCP, Azure).
- Knowledge of machine learning frameworks and libraries.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.
- Ability to influence and drive change within the organization.
- Strategic thinker with a focus on delivering business outcomes.

Desirable Attributes

Proficient in the following advanced analytics techniques (proficiency in most is expected):
- Descriptive Analytics: statistical analysis, data visualization.
- Predictive Analytics: regression analysis, time series forecasting, classification techniques, market mix modelling.
- Prescriptive Analytics: optimization, simulation modelling.
- Text Analytics: Natural Language Processing (NLP), sentiment analysis.

Extensive knowledge of machine learning techniques, including (proficiency in most is expected):
- Supervised Learning: linear regression, logistic regression, decision trees, support vector machines, random forests, gradient boosting machines, among others.
- Unsupervised Learning: k-means clustering, hierarchical clustering, principal component analysis (PCA), anomaly detection, among others.
- Reinforcement Learning: Q-learning, deep Q-networks, etc.

Experience with Generative AI and large language models (LLMs) for text generation, summarization, and conversational agents (good to have):
- Researching, loading, and applying the best LLMs (GPT, Gemini, LLaMA, etc.) for various objectives
- Hyperparameter tuning
- Prompt engineering
- Embedding and vectorization
- Fine-tuning

Proficiency in data visualization tools such as Tableau or Power BI (good to have).

Strong skills in data management, structuring, and harmonization to support analytical needs (must have).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
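As a compact, runnable illustration of the supervised-learning techniques listed above (random forests among them), here is a scikit-learn sketch on synthetic data; the dataset is generated, not drawn from any real domain.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a propensity-style classification problem
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```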
Posted 6 days ago
6.0 - 8.0 years
12 - 18 Lacs
India
Remote
Senior Data Analyst Job Location: Remote Experience Required: 6-8 years Working Days: Monday to Friday Budget: Up to 18 LPA
Job Specification
Qualification: Bachelor's or Master’s degree in Computer Science or Engineering, Mathematics, Industrial Engineering, or Management.
Knowledge (Certification - Technical, Product, Industry, etc.): Insurance & Financial Services; Insurance/Finance/Banking domain experience is a must.
Skills critical to job success/Must-Have Skills:
● Must be an expert in Power BI report development.
● Database management system programming (e.g. Oracle, Microsoft SQL Server)
● User interface and query software
● Agile methodologies
● Predictive modeling, NLP and text analysis
● Data modeling tools (e.g. ERwin, Enterprise Architect and Visio)
● Data mining
● ETL tools
● UNIX, Linux, Solaris and MS Windows
● Hadoop and NoSQL databases
● Data visualization
● Should have experience in the Insurance domain
● Should have machine learning experience
Job Summary
As a Data Analyst you will be responsible for turning data into information, information into insight, and insight into business decisions. You will conduct full lifecycle analysis to include requirements, activities, and design. Data analysts will develop analysis and reporting capabilities and will also monitor performance and quality control plans to identify improvements.
Responsibilities:
● Data Transformation: Convert raw data into meaningful information that can guide business strategies.
● Life Cycle Analysis: Manage the entire lifecycle of data analysis, from gathering requirements to activity coordination and design implementation.
● Develop reports and refine analysis and reporting tools to provide clear insights into business performance.
● Continuously monitor and assess performance metrics to ensure optimal operation and identify areas for improvement.
● Implement and oversee quality control measures to maintain the integrity and accuracy of data analysis.
● Synthesize complex data sets to extract key trends and insights that drive decision-making processes.
● Work closely with cross-functional teams to prioritize data and analytics needs and support data-driven decisions.
● Proactively seek out and recommend process enhancements to streamline data collection and analysis procedures.
● Constantly monitor, refine and report on the performance of data management systems.
● Maintain a corporate repository of all data analysis artifacts and procedures.
● Perform other functions as may be assigned.
Job Types: Full-time, Permanent Pay: ₹1,200,000.00 - ₹1,800,000.00 per year Benefits: Provident Fund Schedule: Day shift, Weekend availability Supplemental Pay: Performance bonus Experience: Data Analyst: 7 years (Required) Work Location: In person
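As a hedged illustration of the quality-control responsibility above, the following sketch runs basic data-quality checks with pandas; the policy-level columns and sample values are invented for the example.

```python
# A minimal sketch of automated data-quality checks on a hypothetical
# insurance extract (policy_id, premium, issue_date are assumed columns).
import pandas as pd

df = pd.DataFrame({
    "policy_id": [101, 102, 102, 104],
    "premium": [5000.0, -20.0, 5500.0, None],
    "issue_date": ["2024-01-05", "2024-02-30", "2024-03-01", "2024-04-10"],
})

issues = {
    "duplicate_policy_id": int(df["policy_id"].duplicated().sum()),
    "missing_premium": int(df["premium"].isna().sum()),
    "negative_premium": int((df["premium"] < 0).sum()),
    # errors="coerce" turns invalid dates (e.g. 2024-02-30) into NaT
    "bad_issue_date": int(pd.to_datetime(df["issue_date"], errors="coerce").isna().sum()),
}
print(issues)  # surface counts before the data feeds reports or models
```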
Posted 6 days ago
0.0 - 2.0 years
0 Lacs
Pune
On-site
The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements. Identify and analyze issues, make recommendations, and implement solutions. Utilize knowledge of business processes, system processes, and industry standards to solve complex issues. Analyze information and make evaluative judgements to recommend solutions and improvements. Conduct testing and debugging, utilize script tools, and write basic code for design specifications. Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures. Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications: 0-2 years of relevant experience. Experience in programming/debugging used in business applications. Working knowledge of industry practice and standards. Comprehensive knowledge of a specific business area for application development. Working knowledge of program languages. Consistently demonstrates clear and concise written and verbal communication.
Education: Bachelor’s degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Essential: Experience primarily with SAP BusinessObjects and Talend ETL. Experience with any one other Business Intelligence tool (e.g., Tableau/Cognos), ETL tools (Ab Initio/Spark), and Unix shell scripting. Experience in RDBMS, preferably Oracle, with SQL query-writing skills. Good understanding of data-warehousing concepts like schemas, facts, and dimensions. Should be able to understand and modify complex universe queries, and design and use the functionalities of the Web Intelligence tool. Familiarity with identification and resolution of data quality issues. Strong and effective interpersonal and communication skills and the ability to interact professionally with business users. Great team player with a passion to collaborate with colleagues. Knowledge of any application server (WebLogic, WAS, Tomcat, etc.)
Adjacent Skills: Apache Spark with Java. Good understanding of Big Data and the Hadoop ecosystem. Good understanding of Hive and Impala. Testing frameworks (test-driven development). Good communication skills. Knowledge of Maven; Python scripting skills. Good problem-solving skills.
Beneficial: EMS, Kafka, Domain Knowledge
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
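As a hedged illustration of the Spark and Hive skills listed above (not Citi's actual codebase), here is a minimal PySpark job that queries a Hive table via Spark SQL; the database and table names are invented for the sketch.

```python
# A sketch assuming a Spark deployment with access to a Hive metastore.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-demo")
         .enableHiveSupport()        # lets Spark resolve Hive metastore tables
         .getOrCreate())

daily_totals = spark.sql("""
    SELECT trade_date, SUM(amount) AS total_amount
    FROM finance_db.trades           -- hypothetical Hive table
    GROUP BY trade_date
""")
daily_totals.show(10)
```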
Posted 6 days ago
7.0 years
3 - 9 Lacs
Bengaluru
On-site
Bangalore, Karnataka, India Job ID 766481 Join our Team
About this Opportunity
The complexity of running and optimizing the next generation of wireless networks, such as 5G with distributed edge compute, will require Machine Learning (ML) and Artificial Intelligence (AI) technologies. Ericsson is setting up an AI Accelerator Hub in India to fast-track our strategy execution, using Machine Intelligence (MI) to drive thought leadership, automate, and transform Ericsson’s offerings and operations. We collaborate with academia and industry to develop state-of-the-art solutions that simplify and automate processes, creating new value through data insights.
What you will do
As a Senior Data Scientist, you will apply your knowledge of data science and ML tools, backed by strong programming skills, to solve real-world problems. Responsibilities:
1. Lead AI/ML features/capabilities in product/business areas
2. Define business metrics of success for AI/ML projects and translate them into model metrics
3. Lead end-to-end development and deployment of Generative AI solutions for enterprise use cases
4. Design and implement architectures for vector search, embedding models, and RAG systems
5. Fine-tune and evaluate large language models (LLMs) for domain-specific tasks
6. Collaborate with stakeholders to translate vague problems into concrete Generative AI use cases
7. Develop and deploy generative AI solutions using AWS services such as SageMaker, Bedrock, and other AWS AI tools; provide technical expertise and guidance on implementing GenAI models and best practices within the AWS ecosystem
8. Develop secure, scalable, and production-grade AI pipelines
9. Ensure ethical and responsible AI practices
10. Mentor junior team members in GenAI frameworks and best practices
11. Stay current with research and industry trends in Generative AI and apply cutting-edge techniques
12. Contribute to internal AI governance, tooling frameworks, and reusable components
13. Work with large datasets, including petabytes of 4G/5G network and IoT data
14. Propose/select/test predictive models and other ML systems
15. Define visualization and dashboarding requirements with business stakeholders
16. Build proof-of-concepts for business opportunities using AI/ML
17. Lead functional and technical analysis to define AI/ML-driven business opportunities
18. Work with multiple data sources and apply the right feature engineering to AI models
19. Lead studies and creative usage of new/existing data sources
What you will bring
Required Experience: minimum 7 years
1. Bachelor's/Master's/Ph.D. in Computer Science, Data Science, AI, ML, Electrical Engineering, or related disciplines from reputed institutes
2. 3+ years of applied ML/AI production-level experience
3. Strong programming skills (R/Python)
4. Proven ability to lead AI/ML projects end-to-end
5. Strong grounding in mathematics, probability, and statistics
6. Hands-on experience with data analysis, visualization techniques, and ML frameworks (Python, R, H2O, Keras, TensorFlow, Spark ML)
7. Experience with semi-structured/unstructured data for AI/ML models
8. Strong understanding of building AI models using Deep Neural Networks
9. Experience with Big Data technologies (Hadoop, Cassandra)
10. Ability to source and combine data from multiple sources for ML models
Preferred Qualifications:
1. Good communication skills in English
2. Certifying MI MOOCs, a plus
3. Domain knowledge in Telecommunication/IoT, a plus
4. Experience with data visualization and dashboard creation, a plus
5. Knowledge of Cognitive models, a plus
6. Experience in partnering and collaborative co-creation in a global matrix organization
Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world’s toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
What happens once you apply?
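To make the vector-search and RAG responsibilities above concrete, here is a minimal retrieval sketch, assuming the sentence-transformers package; the model name and documents are illustrative assumptions, not Ericsson systems.

```python
# A hedged sketch of the retrieval step in a RAG pipeline: embed documents,
# embed the query, and pick the most similar passage to feed an LLM prompt.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Restart the baseband unit after a software upgrade.",
    "High PRB utilisation usually indicates cell congestion.",
    "Alarm 7745 maps to a loss of synchronisation on the S1 link.",
]
model = SentenceTransformer("all-MiniLM-L6-v2")       # assumed small model
doc_vecs = model.encode(docs, normalize_embeddings=True)

query_vec = model.encode(["Why is the cell congested?"], normalize_embeddings=True)
scores = np.dot(doc_vecs, query_vec[0])               # cosine similarity (unit vectors)
print(docs[int(np.argmax(scores))])                   # best passage for the prompt
```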
Posted 6 days ago
0 years
9 Lacs
Bengaluru
On-site
Associate - Production Support Engineer Job ID: R0388741 Full/Part-Time: Full-time Regular/Temporary: Regular Listed: 2025-06-12 Location: Bangalore
Position Overview
Job Title: Associate - Production Support Engineer Location: Bangalore, India
Role Description
You will be operating within Corporate Bank Production as an Associate, Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement in the Production environment through application and user-request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene, as well as supporting the resolution of issues and conflicts and preparing reports and meetings. The candidate should have experience with all relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and that all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with engineering culture.
Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: best-in-class leave policy; gender-neutral parental leaves; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those aged 35 years and above.
Your key responsibilities
Lead by example to drive a culture of proactive continual improvement in the Production environment through automation of manual work, monitoring improvements and platform hygiene. Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues. Engage in the Software Development Lifecycle (SDLC) to enhance Production standards and controls. Update the Run Book and KEDB as and when required. Participate in all BCP and component failure tests based on the run books. Understand the flow of data through the application infrastructure; it is critical to understand the dataflow to best provide operational support. Event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on the instructions of the run book. Drive knowledge management across the supported applications and ensure full compliance. Work with team members to identify areas of focus where training may improve team performance and improve incident resolution.
Your skills and experience
Recent experience of applying technical solutions to improve the stability of production environments. Working experience with some of the following technology skills:
Technologies/Frameworks: Unix, shell scripting and/or Python; SQL stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; ITIL v3 certified (must); Control-M, cron scheduling; MQ - DBUS, IBM; Java 8/OpenJDK 11 (at least) for debugging; familiarity with the Spring Boot framework; data streaming - Kafka (experience with the Confluent flavour a plus) and ZooKeeper; the Hadoop framework
Configuration Mgmt Tooling: Ansible
Operating System/Platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing, and given that Fabric is dependent on OpenShift)
CI/CD: Jenkins (preferred)
APM Tooling: one or more of Splunk, AppDynamics, Geneos, New Relic
Other platforms: scheduling - Control-M is a plus, Autosys, etc.; search - Elasticsearch and/or Solr+ is a plus
Methodology: micro-services architecture; SDLC; Agile; fundamental network topology - TCP, LAN, VPN, GSLB, GTM, etc.; familiarity with TDD and/or BDD; distributed systems; experience on cloud platforms such as Azure or GCP is a plus; familiarity with containerization/Kubernetes
Tools: ServiceNow; Jira; Confluence; Bitbucket and/or Git; IntelliJ; SQL*Plus; familiarity with simple Unix tooling - PuTTY, mPuTTY, Exceed; (PL/)SQL Developer
Good understanding of the ITIL Service Management framework, such as Incident, Problem, and Change processes. Ability to self-manage a book of work and ensure clear transparency on progress, with clear, timely communication of issues. Excellent communication skills, both written and verbal, with attention to detail. Ability to work in a Follow-the-Sun model, in virtual teams and in a matrix structure. Service Operations experience within a global operations context. 6-9 years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function. Global Transaction Banking experience is a plus. Experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall. Experience of run-book execution. Experience of supporting complex application and infrastructure domains. Good analytical, troubleshooting and problem-solving skills. Working knowledge of incident tracking tools (e.g., Remedy, HEAT, etc.).
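As a hedged sketch of the proactive-monitoring automation described above (standard library only; the log path and error patterns are placeholders), a small Python script that summarizes error signatures in an application log:

```python
# Count recurring error signatures so emerging incidents stand out early.
import collections
import re

ERROR_RE = re.compile(r"\b(ERROR|FATAL|OutOfMemoryError)\b")  # illustrative patterns

def summarize_errors(log_path: str) -> collections.Counter:
    """Scan a log file and tally error signatures per pattern."""
    counts = collections.Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = ERROR_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    print(summarize_errors("/var/log/app/server.log"))  # placeholder path
```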
How we’ll support you Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 6 days ago
0 years
0 Lacs
Bengaluru
On-site
About PhonePe Group: PhonePe is India’s leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (Insurance, Mutual Funds, Stock Broking, and Lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, which is India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.
Culture
At PhonePe, we take extra care to make sure you give your best at work, every day! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country and executing on your dreams with purpose and speed, join us!
About the Role: We are looking for a motivated and curious Site Reliability Engineering (SRE) Intern to join our infrastructure team. This internship provides hands-on experience in building, maintaining and scaling production systems, focusing on reliability, performance, and automation. You will work closely with experienced SREs to improve system observability, automate operational processes and ensure service uptime. Interns will engage in real-world infrastructure projects spread over a couple of months, with the potential for a full-time SRE role based on performance and organizational requirements.
Eligible Candidates: Students who graduated in 2024/2025.
Roles and Responsibilities: Assist in maintaining and monitoring production infrastructure and services. Automate routine tasks using scripting languages such as Python or Perl. Help design and implement monitoring, alerting, and logging systems. Collaborate with engineering teams to ensure reliability and scalability of services. Participate in incident response and postmortem documentation. Document processes and contribute to improving internal tools and dashboards.
Skills Required: Currently pursuing a degree in Computer Science or a related field. Basic understanding of Linux systems and shell scripting. Familiarity with cloud platforms like AWS, GCP or Azure is a plus. Good problem-solving skills and eagerness to learn.
Good to Have: Familiarity with distributed systems like Hadoop, Elasticsearch, Kafka, Gluster. Familiarity with container orchestration stacks. Experience with version control systems such as Git. Familiarity with infrastructure-as-code tools like Terraform or Ansible.
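As a small, hedged example of the "automate routine tasks" responsibility above, here is a disk-space check using only the Python standard library; the path and threshold are illustrative choices, not PhonePe tooling.

```python
# Warn when a mount point runs low on space; a typical starter SRE automation.
import shutil

def check_disk(path: str = "/", warn_pct: float = 80.0) -> None:
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    status = "WARN" if used_pct >= warn_pct else "OK"
    print(f"{status}: {path} is {used_pct:.1f}% full")

check_disk("/")  # in practice this would feed an alerting system
```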
PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles) Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Summary
The Senior NLP Data Scientist will work in teams addressing natural language processing, large language models (LLMs), and agentic AI problems in a commercial technology and consultancy development environment. In this role, you will contribute to the development and deployment of advanced NLP techniques, large language models, agentic AI systems, and semantic analysis methods for extracting and understanding structure in large text data sets.
Job Description
Site Overview
Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is our multidisciplinary research and engineering center. Engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing.
Role Overview
Develop and implement state-of-the-art NLP models and algorithms, with a focus on large language models (LLMs) and agentic AI. Collaborate with cross-functional teams to identify and solve complex NLP problems in various domains. Design and conduct experiments to evaluate the performance of NLP models and improve their accuracy and efficiency. Deploy NLP models into production environments, ensuring scalability and robustness. Analyze large text data sets to uncover insights and patterns using advanced statistical and machine learning techniques. Stay up to date with the latest research and advancements in NLP, LLMs, and agentic AI, and apply this knowledge to ongoing projects. Communicate findings and recommendations to stakeholders through clear and concise reports and presentations. Mentor and guide junior data scientists and engineers in best practices for NLP and machine learning. Develop and maintain documentation for NLP models, algorithms, and processes. Collaborate with product managers and engineers to integrate NLP solutions into products and services. Conduct code reviews and ensure adherence to coding standards and best practices. Participate in the development of data collection and annotation strategies to improve model performance. Contribute to the development of intellectual property, including patents and publications, in the field of NLP and AI.
The Ideal Candidate
The ideal candidate should have experience in image analytics, computer vision, Python, and cloud platforms.
Required Qualifications
Bachelor's degree in Computer Science or "STEM" majors (Science, Technology, Engineering and Math) with 5+ years of experience in data science. Demonstrated skill in the use of Python and/or other analytic software tools or languages. Demonstrated skill in guiding teams to solve business problems. Strong communication, interpersonal and leadership skills.
Preferred Qualifications
Proven experience in developing and deploying NLP models, particularly large language models (LLMs) and agentic AI systems. Strong programming skills in Python and familiarity with NLP libraries and frameworks such as TensorFlow, PyTorch, Hugging Face Transformers, LangChain, LangGraph and spaCy. Experience with cloud platforms and tools for deploying machine learning models (e.g., AWS, GCP, Azure). Excellent problem-solving skills and the ability to work independently and as part of a team. Strong communication skills, with the ability to explain complex technical concepts to non-technical stakeholders. Experience with transfer learning, fine-tuning, and prompt engineering for LLMs. Knowledge of agentic AI principles and their application in real-world scenarios.
Familiarity with big data technologies and tools such as Hadoop, Spark, and SQL. Publications or contributions to the NLP and AI research community.
At GE Aerospace, we have a relentless dedication to the future of safe and more sustainable flight and believe in our talented people to make it happen. Here, you will have the opportunity to work on really cool things with really smart and collaborative people. Together, we will mobilize a new era of growth in aerospace and defense. Where others stop, we accelerate.
Additional Information
Relocation Assistance Provided: Yes
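As a hedged illustration of applying a pretrained model with Hugging Face Transformers (one of the libraries named above), here is a minimal pipeline example; the default model choice and sample sentence are assumptions, not GE Aerospace systems.

```python
# transformers' pipeline API downloads a small default model on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("The compressor blade inspection passed without findings."))
# expected shape of output: [{'label': ..., 'score': ...}]
```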
Posted 6 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Data Engineer, you are required to:
Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire data pipeline.
Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets to enable efficient querying.
Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
Stay up-to-date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.
Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.
Experience level: At least 3-5 years of hands-on experience in Data Engineering and ETL.
Desired Knowledge & Experience:
Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming. Knowledge of Spark internals: Catalyst/Tungsten/Photon.
Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
Test: pytest, Great Expectations
CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
Languages: Python/Functional Programming (FP)
SQL: T-SQL/Spark SQL/HiveQL
Storage: Data Lake and Big Data Storage Design
Additionally, it is helpful to know the basics of:
Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
Languages: Scala, Java
NoSQL: Cosmos, Mongo, Cassandra
Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
SQL Server: T-SQL, Stored Procedures
Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
Data Catalog: Azure Purview, Apache Atlas, Informatica
Required Soft Skills & Other Capabilities:
Great attention to detail and good analytical abilities. Good planning and organizational skills. A collaborative approach to sharing ideas and finding solutions. Ability to work independently and also in a global team environment.
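To ground the Lakehouse/Medallion items above, here is a hedged sketch of a bronze-to-silver cleansing step, assuming a Delta-enabled Spark session (e.g., Databricks); the paths and column names are invented for illustration.

```python
# Read raw (bronze) Delta data, apply basic cleansing, write curated (silver).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

bronze = spark.read.format("delta").load("/lake/bronze/orders")   # assumed path

silver = (bronze
          .dropDuplicates(["order_id"])                           # basic cleansing
          .withColumn("order_ts", F.to_timestamp("order_ts"))     # assumed column
          .withColumn("order_date", F.to_date("order_ts"))
          .filter(F.col("amount") > 0))

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")                                 # partitioning choice
       .save("/lake/silver/orders"))
```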
Posted 6 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions. Create Proofs of Concept (POCs) / Minimum Viable Products (MVPs), then guide them through to production deployment and operationalization of projects. Influence machine learning strategy for Digital programs and projects. Make solution recommendations that appropriately balance speed to market and analytical soundness. Explore design options to assess efficiency and impact, and develop approaches to improve robustness and rigor. Develop analytical/modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow). Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations. Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories. Create algorithms to extract information from large, multiparametric data sets. Deploy algorithms to production to identify actionable insights from large databases. Compare results from various methodologies and recommend optimal techniques. Develop and embed automated processes for predictive model validation, deployment, and implementation. Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science. Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment. Lead discussions at peer review and use interpersonal skills to positively influence decision making. Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices. Facilitate cross-geography sharing of new ideas, learnings, and best practices.
Requirements
Bachelor of Science or Bachelor of Engineering at a minimum. 4+ years of work experience as a Data Scientist. A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project. Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala). Good hands-on skills in both feature engineering and hyperparameter optimization. Experience producing high-quality code, tests, and documentation. Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks. Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and/or deep learning methodologies. Proficiency in statistical concepts and ML algorithms. Good knowledge of Agile principles and process. Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team. Ability to share ideas in a compelling manner, and to clearly summarize and communicate data analysis assumptions and results. Self-motivated and a proactive problem solver who can work independently and in teams.
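As a hedged illustration of the hyperparameter-optimization skill listed above, here is a GridSearchCV run on synthetic data with scikit-learn; the parameter grid and model choice are illustrative, not a prescribed approach.

```python
# Exhaustive grid search with cross-validation over a small parameter grid.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

param_grid = {
    "n_estimators": [100, 200],
    "max_depth": [2, 3],
    "learning_rate": [0.05, 0.1],
}
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid, cv=5, scoring="roc_auc")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```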
Posted 6 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Senior Software Engineer Department: IDP
About Us
HG Insights is the global leader in technology intelligence, delivering actionable AI-driven insights through advanced data science and scalable big data solutions. Our Big Data Insights Platform processes billions of unstructured documents and powers a vast data lake, enabling enterprises to make strategic, data-driven decisions. Join our team to solve complex data challenges at scale and shape the future of B2B intelligence.
What You’ll Do:
Design, build, and optimize large-scale distributed data pipelines for processing billions of unstructured documents using Databricks, Apache Spark, and cloud-native big data tools. Architect and scale enterprise-grade big-data systems, including data lakes, ETL/ELT workflows, and syndication platforms for customer-facing Insights-as-a-Service (InaaS) products. Collaborate with product teams to develop features across databases, backend services, and frontend UIs that expose actionable intelligence from complex datasets. Implement cutting-edge solutions for data ingestion, transformation, and analytics using Hadoop/Spark ecosystems, Elasticsearch, and cloud services (AWS EC2, S3, EMR). Drive system reliability through automation, CI/CD pipelines (Docker, Kubernetes, Terraform), and infrastructure-as-code practices.
What You’ll Be Responsible For
Leading the development of our Big Data Insights Platform, ensuring scalability, performance, and cost-efficiency across distributed systems. Mentoring engineers, conducting code reviews, and establishing best practices for Spark optimization, data modeling, and cluster resource management. Building and troubleshooting complex data pipelines, including performance tuning of Spark jobs, query optimization, and data quality enforcement. Collaborating in agile workflows (daily stand-ups, sprint planning) to deliver features rapidly while maintaining system stability. Ensuring security and compliance across data workflows, including access controls, encryption, and governance policies.
What You’ll Need
BS/MS/Ph.D. in Computer Science or a related field, with 5+ years of experience building production-grade big data systems. Expertise in Scala/Java for Spark development, including optimization of batch/streaming jobs and debugging distributed workflows. Proven track record with Databricks, Hadoop/Spark ecosystems, and SQL/NoSQL databases (MySQL, Elasticsearch); cloud platforms (AWS EC2, S3, EMR) and infrastructure-as-code tools (Terraform, Kubernetes); and RESTful APIs, microservices architectures, and CI/CD automation. Leadership experience as a technical lead, including mentoring engineers and driving architectural decisions. Strong understanding of agile practices, distributed computing principles, and data lake architectures. Airflow orchestration (DAGs, operators, sensors) and integration with Spark/Databricks. 7+ years of designing, modeling and building big data pipelines in an enterprise work setting.
Nice-to-Haves
Experience with machine learning pipelines (Spark MLlib, Databricks ML) for predictive analytics. Knowledge of data governance frameworks and compliance standards (GDPR, CCPA). Contributions to open-source big data projects or published technical blogs/papers. DevOps proficiency in monitoring tools (Prometheus, Grafana) and serverless architectures.
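As a hedged sketch of the Airflow orchestration requirement above (Airflow 2.4+ API; the task callables, DAG id, and schedule are placeholders, not HG Insights pipelines):

```python
# A two-task daily DAG; each task here just prints, standing in for spark-submit.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_spark_job(**context):
    # context["ds"] is the logical date string Airflow passes to the callable
    print("submit Spark job for", context["ds"])

with DAG(
    dag_id="daily_insights_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # older Airflow versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=run_spark_job)
    transform = PythonOperator(task_id="transform", python_callable=run_spark_job)
    ingest >> transform         # ingest must finish before transform starts
```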
Posted 6 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Preferred Education: Master's Degree
Required Technical And Professional Expertise
Experience in Big Data technologies like Hadoop, Apache Spark, Hive. Practical experience in Core Java (1.8 preferred)/Python/Scala. Experience with AWS cloud services, including S3, Redshift, EMR, etc. Strong expertise in RDBMS and SQL. Good experience in Linux and shell scripting. Experience building data pipelines using Apache Airflow.
Preferred Technical And Professional Experience
You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 6 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions. Create Proofs of Concept (POCs) / Minimum Viable Products (MVPs), then guide them through to production deployment and operationalization of projects. Influence machine learning strategy for Digital programs and projects. Make solution recommendations that appropriately balance speed to market and analytical soundness. Explore design options to assess efficiency and impact, and develop approaches to improve robustness and rigor. Develop analytical/modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow). Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations. Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories. Create algorithms to extract information from large, multiparametric data sets. Deploy algorithms to production to identify actionable insights from large databases. Compare results from various methodologies and recommend optimal techniques. Develop and embed automated processes for predictive model validation, deployment, and implementation. Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science. Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment. Lead discussions at peer review and use interpersonal skills to positively influence decision making. Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices. Facilitate cross-geography sharing of new ideas, learnings, and best practices.
Requirements
Bachelor of Science or Bachelor of Engineering at a minimum. 4+ years of work experience as a Data Scientist. A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project. Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala). Good hands-on skills in both feature engineering and hyperparameter optimization. Experience producing high-quality code, tests, and documentation. Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks. Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and/or deep learning methodologies. Proficiency in statistical concepts and ML algorithms. Good knowledge of Agile principles and process. Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team. Ability to share ideas in a compelling manner, and to clearly summarize and communicate data analysis assumptions and results. Self-motivated and a proactive problem solver who can work independently and in teams.
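As a hedged illustration of the evaluation-metrics skill listed above, here is a hold-out evaluation on synthetic data; the model and metric choices (F1, ROC AUC) are illustrative only.

```python
# Fit on a training split, score on a held-out split, report two metrics.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=15, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba = model.predict_proba(X_te)[:, 1]
print("F1:", round(f1_score(y_te, proba > 0.5), 3),
      "AUC:", round(roc_auc_score(y_te, proba), 3))
```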
Posted 6 days ago
10.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Immediate/Early Joiners Preferred
We’re seeking a seasoned Data Analytics Engineer (10+ years of experience) to lead data architecture, analytics, and visualization initiatives for our client. This role involves building scalable data pipelines, transforming large datasets, and delivering actionable insights through BI tools.
Key Responsibilities
Design and maintain ETL pipelines, data models, and architectures. Analyze large-scale data using SQL and Python. Create dashboards and visualizations in Power BI or Tableau. Work with big data (Spark, Hadoop) and cloud platforms (AWS, Azure, or GCP). Manage structured and unstructured data (SQL, NoSQL: Cassandra/MongoDB). Collaborate with cross-functional teams to deliver data-driven solutions. Document systems and ensure performance monitoring and data integrity.
Requirements
10+ years of experience, including 5+ years in data analytics and 5+ in data engineering. Proficient in SQL, Python, ETL, and data modeling. Hands-on with BI tools, big data tech, and cloud environments. Strong communication, problem-solving, and stakeholder engagement skills. Degree in Computer Science, Data Science, or related field (Master’s preferred).
You may also share your resume with us at cv@refrelay.com.
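As a hedged, minimal ETL sketch (extract, transform, load) matching the pipeline duties above; the CSV file, column names, and SQLite target are placeholders for a real warehouse.

```python
# Extract a CSV, apply basic transforms, and load the result into a table.
import sqlite3

import pandas as pd

raw = pd.read_csv("sales_raw.csv")                     # extract (assumed file)
clean = (raw.dropna(subset=["order_id"])               # transform: drop bad rows,
            .assign(order_date=lambda d: pd.to_datetime(d["order_date"]))
            .query("amount > 0"))                      # keep only valid amounts

with sqlite3.connect("analytics.db") as conn:          # load
    clean.to_sql("fact_sales", conn, if_exists="replace", index=False)
```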
Posted 6 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Company Description
It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today: ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
Job Description
Who are the Account Escalation Analyst Team? A global group of highly skilled engineers working alongside Account Escalation Managers to tackle the most complex technical issues faced by our customers. With senior-level visibility both internally and within client organizations, this team is crucial in maximising customer satisfaction, maintaining product success, and delivering innovation whilst providing world-class service. We are seeking skilled recruits in India to help focus on the identification and diagnosis of emerging performance issues before they become customer-impacting, and to operate as part of the wider team to ensure timely resolution of customer-impacting issues.
As a Senior Performance Support Specialist you will...
Successfully diagnose the entire technology stack, from the front end to the back end, to determine where to start troubleshooting an issue. Interpret technical data to identify trends and resolve system bottlenecks. Work on performance tuning of our web-based applications. Liaise with clients directly and deliver great customer service, instilling trust from our clients.
Qualifications
To be successful in this role, we need someone who has: Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry. Successful candidates will have at least 5 years’ experience in a Technical Support or similar role. In addition, the ideal candidate will be someone who has:
Demonstrated ability to troubleshoot complex technical issues
Strong experience with relational databases (e.g. MySQL, Oracle)
Java experience
Experience in one (or more) scripting languages: JavaScript, Python, Perl, Unix Shell, Windows Shell
Advanced Unix/Linux experience
Working knowledge of the components in a web application stack
Experience diagnosing performance degradation (e.g. explain plans, database tuning)
Experience working well in a team environment while also being able to work productively while unsupervised
Nice-to-have skills: statistical modelling, data analysis, experience using Big Data/Hadoop, machine learning.
This is a critical role for the company where you receive full product training and interact with some of our biggest accounts.
Additional Information
Work Personas We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work. Learn more here. Equal Opportunity Employer ServiceNow is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements. Accommodations We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact globaltalentss@servicenow.com for assistance. Export Control Regulations For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities. From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license.
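As a hedged, fully runnable illustration of the explain-plan-driven tuning mentioned in the qualifications above (using SQLite's EXPLAIN QUERY PLAN here; the same idea applies to MySQL or Oracle EXPLAIN):

```python
# Show how an index changes the query plan from a full scan to an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")

query = "SELECT SUM(amount) FROM txn WHERE account = ?"
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", ("A1",)).fetchall())  # SCAN txn

conn.execute("CREATE INDEX idx_txn_account ON txn(account)")
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", ("A1",)).fetchall())  # SEARCH via index
```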
Posted 6 days ago
The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.
India's major IT hubs are known for their thriving technology industries and have a high demand for Hadoop professionals.
The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.
In the Hadoop field, a typical career path progresses from Junior Developer to Senior Developer and Tech Lead, and eventually to roles such as Data Architect or Big Data Engineer.
In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!