5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Position Summary: As a Data Engineer III at Walmart, you will be responsible for designing, building, and maintaining large-scale data systems and architectures to support business intelligence and analytics initiatives.
About the team: The Walmart Last Mile Data team is a dynamic and innovative group within the Data and Customer Analytics organization, responsible for developing and maintaining data solutions that drive business growth and efficiency in Walmart's last-mile delivery operations. As a Data Engineer III, you will be part of a collaborative team that leverages cutting-edge technologies and data analytics to optimize delivery routes, improve customer satisfaction, and reduce costs.
What you'll do: As a Data Engineer, you will play a critical role in designing, developing, and implementing data pipelines and data integration solutions using Spark, Scala, Python, Airflow, and Google Cloud Platform (GCP). You will be responsible for building scalable and efficient data processing systems, optimizing data workflows, and ensuring data quality and integrity. Monitor and troubleshoot data pipelines to ensure data availability and reliability. Conduct performance tuning and optimization of data processing systems for improved efficiency and scalability. Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Work closely with data scientists and analysts to provide them with the data sets and tools they need for analysis and reporting. Create data tools that help analytics team members build and optimize our product into an innovative industry leader. Stay up to date with the latest industry trends and technologies in data engineering and apply them to enhance the data infrastructure.
What you'll bring: Proven working experience as a Data Engineer, with a minimum of 5 years in the field. Strong programming skills in Scala and experience with Spark for data processing and analytics. Familiarity with Google Cloud Platform (GCP) services such as BigQuery, GCS, and Dataproc. Experience developing near-real-time ingestion pipelines using Kafka and Spark Structured Streaming. Experience with data modelling, data warehousing, and ETL processes. Understanding of data warehousing concepts and best practices. Strong knowledge of SQL and NoSQL systems. Proficiency in version control systems, particularly Git. Proficiency in working with large-scale data sets and distributed computing frameworks. Familiarity with CI/CD pipelines and tools such as Jenkins or GitLab CI. Familiarity with schedulers like Airflow. Strong problem-solving and analytical skills. Familiarity with BI and visualisation tools like Tableau or Looker. A background in generative artificial intelligence (Gen AI) is desirable but not essential.
Minimum Qualifications: Option 1: Bachelor's degree in Computer Science and 2 years' experience in software engineering or a related field. Option 2: 4 years' experience in software engineering or a related field. Option 3: Master's degree in Computer Science.
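To make the near-real-time ingestion requirement concrete, here is a minimal PySpark sketch of a Kafka-to-Spark-Structured-Streaming pipeline of the kind the posting describes; the broker address, topic name, schema, and GCS paths are illustrative assumptions, not details from the role.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> parsed JSON -> GCS sink.
# Broker, topic, schema, and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("delivery-events-ingest").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "delivery-events")             # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://example-bucket/delivery_events/")              # hypothetical path
    .option("checkpointLocation", "gs://example-bucket/checkpoints/de/") # required for recovery
    .outputMode("append")
    .start()
)
```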
Posted 3 weeks ago
6.0 - 11.0 years
20 - 25 Lacs
Bengaluru
Work from Office
At Branch, we're transforming how brands and users interact across digital platforms. Our mobile marketing and deep linking solutions are trusted to deliver seamless experiences that increase ROI, decrease wasted spend, and eliminate siloed attribution. Our Branch team consists of smart, humble, and collaborative people who value ownership over all. Everything we do is centered around creating a great product, team, and company that lives and breathes our motto: Build Together, Grow Together, Win Together.
As a Senior Data Engineer, you will design, build, and manage components of our highly available real-time and batch data pipelines handling petabytes of data. Our pipeline platform is designed to ingest and process billions of events per day and make the resulting aggregations and insights available within minutes in our analytical data stores. Data analytics is at the core of our business, and we're constantly innovating to make our systems more performant, timely, cost-effective, and capable while maintaining high reliability. You will build on top of our core data infrastructure and pipelines using technologies and tools tailored for massive data sets, including Flink, Spark, Kafka, Iceberg, and Druid, while working in the AWS cloud environment. If you are interested in building systems that can consume and explore billions of data points a day, work with petabytes of data, and want to push what is possible with data, this is the place for you!
As a Senior Data Engineer, you'll get to: Architect, build, and own real-time and batch data aggregation systems to deliver quality analytical reports for our internal and external customers. Collaborate with Data Scientists, Backend Engineers, Data Infrastructure Operations, and Product Managers to deliver new features and capabilities for customers. Develop clean, safe, testable, and cost-efficient solutions. Make well-informed decisions with deep knowledge of both the internal and external impacts on teams and projects. Foresee shortcomings ahead of time and drive them to resolution.
You'll be a good fit if you have: A Bachelor's in CS or equivalent. 6+ years of software engineering or data engineering experience with a recent focus on big data. Strong development skills in Java or Scala and Python. A solid background in the fundamentals of computer science, distributed systems, and large-scale data processing, as well as database schema design and data warehousing. Practical experience managing AWS or Google Cloud environments; experience with containerized deployment or Kubernetes is a big plus! A good understanding of a broad spectrum of NoSQL, traditional RDBMS, and analytical/columnar data stores, including Postgres, Druid, Vertica, Redshift, Hadoop, Hive, Cassandra, Aerospike, and Redis. The ability to build systems that balance scalability, availability, and latency. A strong ability to advocate for continuous deployment, automation tools, monitoring, and self-healing systems that improve the lives of our engineers. Great communication skills: you are a team player with a proven track record of building strong relationships with management, co-workers, and customers. A desire to learn and grow, push yourself and your team, share lessons with others, provide constructive and continuous feedback, and be receptive to feedback from others.
This role will be based at our Bengaluru, KA office and follows a Hybrid schedule that will be aligned with our Return to Office guidelines.
The salary range provided represents base compensation and does not include potential equity, which is available for qualifying positions. At Branch, we are committed to the well-being of our team by offering a comprehensive benefits package. From health and wellness programs to paid time off and retirement planning options, we provide a range of benefits for qualified employees. For detailed information on the benefits specific to your position, please consult with your recruiter. Branch is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. If you think you'd be a good fit for this role, we'd love for you to apply! At Branch, we strive to create an inclusive culture that encourages people from all walks of life to bring their unique, diverse perspectives to work. We aim every day to build an environment that empowers us all to do the best work of our careers, and we can't wait to show you what we have to offer! A little bit about us: Branch is the leading provider of engagement and performance mobile SaaS solutions for growth-focused teams, trusted to maximize the value of their evolving digital strategies. The Branch platform provides a seamless experience across paid and organic, on all channels and platforms, online and offline, to eliminate friction and drive valuable action at the moments of highest intent. With Branch, businesses gain accurate mobile measurement and insights into user interactions, enabling them to drive conversions, engagement, and more intelligent marketing spend. Branch is an award-winning employer headquartered in Mountain View, CA. World-class brands like Instacart, Western Union, NBCUniversal, Zocdoc, and Sephora acquire users, retain customers, and drive more conversions with Branch. Candidate Privacy Information: For more information on the data that Branch will collect through your application, and how we use, share, delete, and retain that information as part of our recruitment and employment efforts, please see our HR Privacy Policy.
Posted 3 weeks ago
4.0 - 11.0 years
22 - 27 Lacs
Mumbai
Work from Office
The HERE Analytics group is looking for a Sr. Software Engineer to build extensive ETL pipelines and strengthen the infrastructure of big data visualization tools used to view complex, large-scale location attributes on a map. The job includes all parts of the software development lifecycle: refining product vision, gathering requirements, software system design, coding, testing, release, and support. As our team is spread around the globe, you will work collaboratively with other team members from different locations. Your daily activities will comprise one or more of the following tasks: Tackle interesting and challenging problems of large-scale data extraction, transformation, and enrichment. Implement tools to enhance both automated and semi-automated map data processing, involving a backend/service-based software stack and front-end visualization components for big data analysis.
Posted 3 weeks ago
3.0 - 7.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks. Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights. Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks. Ensure data quality, integrity, and security throughout all stages of the data lifecycle. Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions. Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features. Provide technical guidance and expertise to junior data engineers and developers. Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering. Contribute to the continuous improvement of data engineering processes, tools, and best practices. Bachelor's or Master's degree in computer science, engineering, or a related field. 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions. Strong
Posted 3 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Gurugram, Bengaluru
Work from Office
Educational Qualification: Essential: B.E./B.Tech/M.Tech/M.Sc. in any stream. Desirable: Specialization in data science or statistics.
Role: Data Scientist
Responsibilities:
Data Solutions Architecture: Develop innovative data-driven solutions for business challenges using telematics data. Collaborate with domain experts to gain automotive insights.
IoT Device Mastery: Understand the Telematics Control Unit (TCU) and the time-series data it generates.
Data Landscape Analysis: Evaluate data adequacy and establish a comprehensive understanding of the data landscape.
Data Preparation: Clean and prepare datasets for modeling. Engage in ETL processes and apply data transformation techniques such as resampling, filtering, and encoding.
Exploratory Data Analysis: Conduct exploratory data analysis to derive insights. Present descriptive statistics and insights to domain experts. Identify meaningful patterns, detect seasonality and trends, and establish cause-and-effect relationships.
Feature Engineering: Design and select features, study feature importance, and decide on the machine learning strategy.
Model Development: Select appropriate machine learning/deep learning models, set up data pipelines for model training, and perform hyper-parameter tuning, validation, and testing. Apply ensemble modeling techniques if required.
Reporting & Visualization: Create comprehensive reports and visualize data using plots and heat maps. Utilize tools like the Google Maps API for visualization.
Technical Skills / Experience:
Essential: Industry Experience: Minimum of 5 years in a data scientist role. Machine Learning Expertise: Experience with machine learning algorithms (e.g., generalized linear models, boosting, decision trees, neural networks, SVMs, Bayesian methods, time-series models). Hands-on Experience: Proficiency in using machine learning models for regression, classification, and unsupervised learning; knowledge of clustering techniques. Cloud Computing: Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform. Programming Skills: Strong programming skills in Python, with experience using libraries like pandas, numpy, matplotlib, and sklearn. Data Visualization: Proficiency in data visualization techniques and tools.
Desirable: Databricks Platform: Experience with Databricks for big data processing and machine learning. Distributed Computing: Experience with Spark or other distributed computing frameworks. AutoML Tools: Understanding of tools like AWS SageMaker, Google AutoML, IBM AutoAI, and Databricks. Telematics Data Analytics: Experience in time-series/IoT data analytics, including data streaming from vehicle on-board IoT devices. Automotive Systems Knowledge: Exposure to automotive systems, the basics of automobiles, and the Controller Area Network (CAN) protocol. Remote Collaboration: Experience working with remote team members. Advanced Visualization Tools: Experience with data visualization tools such as Tableau and Power BI. Google Maps API: Working experience with the Google Maps API for creating heat maps.
Behavioral: Analytical Thinking: Strong analytical skills to interpret complex data and derive actionable insights. Problem-Solving: Ability to approach problems creatively and develop innovative solutions. Attention to Detail: Meticulous attention to detail to ensure data accuracy and model reliability. Communication: Excellent communication skills to convey technical information to non-technical stakeholders. Collaboration: Strong team player with the ability to work collaboratively in a cross-functional environment. Adaptability: Flexibility to adapt to changing business needs and evolving technologies. Curiosity: A proactive learner with a curiosity to explore new technologies and methodologies. Time Management: Effective time management skills to handle multiple projects and meet deadlines.
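As a rough illustration of the resampling and feature-engineering steps this posting describes, here is a minimal pandas/scikit-learn sketch; the input file, column names, label, and 10-second resampling window are all assumptions made for the example, not details of the actual role.

```python
# Illustrative telematics feature pipeline: resample raw TCU time-series signals,
# derive per-trip features, and fit a baseline classifier.
# Columns, label, and the resampling window are assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

raw = pd.read_csv("tcu_signals.csv", parse_dates=["timestamp"])  # hypothetical file

# Resample irregular readings onto a regular 10-second grid per trip,
# then aggregate to one feature row per trip.
features = (
    raw.set_index("timestamp")
       .groupby("trip_id")[["speed_kmph", "engine_rpm"]]
       .resample("10s").mean()
       .interpolate()                       # fill short gaps left by resampling
       .groupby("trip_id").agg(["mean", "std", "max"])
)
features.columns = ["_".join(c) for c in features.columns]

labels = raw.groupby("trip_id")["harsh_driving_flag"].max()  # hypothetical label
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```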
Posted 3 weeks ago
2.0 - 4.0 years
14 - 16 Lacs
Mumbai
Work from Office
Job Title: Sr. Software Engineer. Job Code: 9586. Country: IN. City: Mumbai. Skill Category: IT\Technology.
Description: Department overview: Wholesale Data Services and Operations Technology supports two major functions within Nomura. It provides foundational architectural solutions for market and reference data that power Nomura's business. This team is an integral part of Nomura and is responsible for developing and maintaining systems that manage the acquisition and distribution of data for the entire organization, including front office, middle office, back office, risk, finance, and various AI-based analytics systems. It is responsible for supporting the trade settlement and transaction processing technology platform. This team provides day-to-day support for middle office, operations, and regulatory users, and manages vendor and in-house applications. WO IT also owns the applications that provide trade settlement data services to other technology teams from Operations, Finance, Risk, Regulatory, and Compliance. The selected person will be part of AeJ Domestic Wholesale Data Services Operations Technology, focusing on International Wealth Management (IWM) business functions.
The roles and responsibilities are as follows: Develop and maintain web applications to automate the Operations processes for IWM. Understand the existing Ops functions and participate in defining new solutions that meet their business objectives. Understand the overall ecosystem and develop appropriate integration solutions. Deliver high-quality code within the committed deadlines. Adhere to the best coding practices that reduce technical debt. Work with development and support team members across different regions. Partner with analysts across the globe to understand the requirements and to define solutions. Support resolution of production and user issues, application testing, and maintenance releases. Partner with stakeholders for testing and implementation of the deliverables. Provide regular updates regarding status or progress made to managers and stakeholders. Gain understanding of various applications and systems being developed by the peer groups. Strong problem-solving and analytical skills. Good written and verbal communication skills. Willing to learn new technologies/tools as required to effectively deliver output.
Key Skills: Mandatory Skills: 2-4 years of core Java programming experience. Mastery of the Spring/Spring Boot framework. Good understanding of OOP concepts and design patterns, as well as DB table design and normalization. Strong problem-solving experience in a technical environment. Experience with building low-latency, large-scale data processing systems. Familiarity with Java and web testing frameworks (e.g., JUnit, Selenium). Understanding of memory management, multithreading, concurrency, and synchronization. Expertise in RESTful web services and microservices architecture. Solid understanding of ORM frameworks (e.g., Hibernate). Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL). Version control systems, preferably Git. Experience with build tools like Maven or Gradle. Strong knowledge of software design patterns and principles (SOLID, DRY, etc.). Proficiency in unit testing and test-driven development (TDD). Experience with Agile methodologies. The candidate should be a quick learner and should have demonstrated quick-learning capabilities in the past.
Strong listening, problem-solving, and analytical skills, and excellent communication skills (both spoken and written English). Must be a team player with prior experience working in a global development team. Self-motivated individual, quality and improvement focused.
Desirable Skills: Working knowledge of Python. Building and creating shell scripts. Familiar with Linux commands and able to easily navigate, stop, start, and debug services. Experience in CI/CD, working with pipelines and integrating pipelines with GitLab. Strong with Git, branching strategies, peer review, and commit clarity. Knowledge of containerization technologies (Docker, Kubernetes). Familiarity with frontend technologies (JavaScript, Angular, React, or Vue.js). Experience with NoSQL databases (MongoDB, Cassandra, etc.). Knowledge of messaging systems (Apache Kafka, RabbitMQ). Familiarity with CI/CD pipelines and tools (Jenkins, GitLab CI, etc.). Experience with performance tuning and optimization. Understanding of security best practices in Java development. Experience with GraphQL. Knowledge of caching mechanisms (e.g., Redis, Memcached). Familiarity with logging and monitoring tools (ELK stack, Prometheus, Grafana). Experience with code quality tools (SonarQube, Checkstyle, etc.). Industry exposure to investment banking.
We are committed to providing equal opportunities throughout employment, including in the recruitment, training, and development of employees. We prohibit discrimination in the workplace whether on grounds of gender, marital or domestic partnership status, pregnancy, carer's responsibilities, sexual orientation, gender identity, gender expression, race, color, national or ethnic origins, religious belief, disability, or age.
*Applying for this role does not amount to a job offer or create an obligation on Nomura to provide a job offer. The expression "Nomura" refers to Nomura Services India Private Limited together with its affiliates.
Posted 3 weeks ago
8.0 - 13.0 years
12 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string, and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address defects. Retests software corrections to ensure problems are resolved. Documents evolution of testing procedures for future replication. May conduct performance and scalability testing.
RESPONSIBILITIES: Plans, conducts, and leads assignments, generally involving moderate- to high-budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment, and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates, and researches test results; documents evolution of testing scripts for future replication. Identifies, recommends, and implements changes to enhance the effectiveness of quality assurance strategies.
Additional Details: Data Quality Engineer.
Skills: 8+ years of experience with the Informatica Data Quality tool. Proficiency in using Informatica Data Quality (IDQ) for data profiling to identify data anomalies and patterns. Ability to analyze data quality metrics and generate reports. Expertise in designing and implementing data cleansing routines using IDQ. Strong knowledge of SQL for querying databases. Experience with database systems like Lakehouse, PostgreSQL, Teradata, and SQL Server. Skills in standardizing, validating, and enriching data. Ability to create and manage data quality rules and workflows in IDQ. Experience in automating data quality checks and validations. Knowledge of integrating IDQ with other data management tools and platforms. Skills in managing data flows and ensuring data consistency. Experience with data manipulation libraries like Pandas and NumPy. Expertise in using PySpark for big data processing and analytics. Strong SQL skills for querying and managing relational databases. Ability to write complex SQL queries for data extraction and transformation.
Good to have: Familiarity with industry-standard data quality tools.
Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Collaborate with other teams to understand data requirements and deliver solutions. Design and implement data quality standards and measures using IDQ. Ensure data accuracy, reliability, and integrity. Conduct data profiling to identify data issues and anomalies. Implement data cleansing routines to standardize and validate data. Create and manage data quality rules and workflows in IDQ. Automate data quality checks and validations. Develop and maintain scalable data pipelines using Python and PySpark. Implement ETL processes to ensure accurate data processing.
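To make the automated data-quality checks this posting calls for concrete, here is a minimal PySpark sketch; the dataset path, columns, and rules are illustrative assumptions rather than details of the actual IDQ rule set.

```python
# Illustrative automated data-quality checks in PySpark: null-rate, uniqueness,
# range, and format rules with a simple pass/fail report. Names and thresholds
# are assumptions, not details from the posting.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/customers")  # hypothetical dataset

total = df.count()
checks = {
    "customer_id not null": df.filter(F.col("customer_id").isNull()).count() == 0,
    "customer_id unique": df.select("customer_id").distinct().count() == total,
    "age in [0, 120]": df.filter(~F.col("age").between(0, 120)).count() == 0,
    "email format": df.filter(~F.col("email").rlike(r"^[^@\s]+@[^@\s]+$")).count() == 0,
}

failed = [name for name, ok in checks.items() if not ok]
for name, ok in checks.items():
    print(f"{'PASS' if ok else 'FAIL'}: {name}")
if failed:
    raise ValueError(f"data quality checks failed: {failed}")
```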
Posted 3 weeks ago
8.0 - 13.0 years
12 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string, and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address defects. Retests software corrections to ensure problems are resolved. Documents evolution of testing procedures for future replication. May conduct performance and scalability testing.
RESPONSIBILITIES: Plans, conducts, and leads assignments, generally involving moderate- to high-budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment, and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates, and researches test results; documents evolution of testing scripts for future replication. Identifies, recommends, and implements changes to enhance the effectiveness of quality assurance strategies.
Additional Details: Power BI Developer.
Skills: Power BI, Python, PySpark, and SQL. 8+ years of experience in Power BI. Proficiency in designing data models that support reports and dashboards. Mastery of DAX for calculations and data manipulation. Skills in transforming raw data into a format suitable for analysis. Strong knowledge of SQL for querying databases. Ability to create visually appealing and interactive reports and dashboards. Proficiency in PySpark and Spark scripting. Strong analytical skills to interpret data and provide insights. Ability to troubleshoot and resolve data-related issues.
Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Design, implement, and optimize reporting solutions using Power BI and SQL. Utilize PySpark and Spark scripting for data processing and analysis. Develop and maintain Power BI reports and dashboards. Analyze data to identify trends, patterns, and insights. Ensure data accuracy and consistency across reports.
Posted 3 weeks ago
8.0 - 13.0 years
12 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string, and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address defects. Retests software corrections to ensure problems are resolved. Documents evolution of testing procedures for future replication. May conduct performance and scalability testing.
RESPONSIBILITIES: Plans, conducts, and leads assignments, generally involving moderate- to high-budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment, and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates, and researches test results; documents evolution of testing scripts for future replication. Identifies, recommends, and implements changes to enhance the effectiveness of quality assurance strategies.
Additional Details:
Skills: Python, PySpark, and SQL. 8+ years of experience in Spark, Scala, and PySpark for big data processing. Proficiency in Python programming for data manipulation and analysis. Experience with Python libraries such as Pandas and NumPy. Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL). Strong knowledge of SQL for querying databases. Experience with database systems like Lakehouse, PostgreSQL, Teradata, and SQL Server. Ability to write complex SQL queries for data extraction and transformation. Strong analytical skills to interpret data and provide insights. Ability to troubleshoot and resolve data-related issues. Strong problem-solving skills to address data-related challenges. Effective communication skills to collaborate with cross-functional teams.
Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Collaborate with other teams to understand data requirements and deliver solutions. Design, develop, and maintain scalable data pipelines using Python and PySpark. Utilize PySpark and Spark scripting for data processing and analysis. Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored. Develop and maintain Power BI reports and dashboards. Optimize data pipelines for performance and reliability. Integrate data from various sources into centralized data repositories. Ensure data quality and consistency across different data sets. Analyze large data sets to identify trends, patterns, and insights. Optimize PySpark applications for better performance and scalability. Continuously improve data processing workflows and infrastructure.
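For the ETL responsibilities described above, a minimal PySpark extract-transform-load sketch might look like the following; source paths, column names, and the partitioning scheme are assumptions made for illustration.

```python
# Minimal PySpark ETL sketch: read raw orders, join reference data, aggregate,
# and write a partitioned table. All names and paths are illustrative.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.json("/raw/orders/")            # hypothetical source
customers = spark.read.parquet("/ref/customers/")   # hypothetical reference data

daily_revenue = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .join(customers, "customer_id", "left")
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("buyers"))
)

(daily_revenue.write.mode("overwrite")
    .partitionBy("order_date")                      # partition for downstream reads
    .parquet("/curated/daily_revenue/"))
```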
Posted 3 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
1. SharePoint Development and Customization: Design, develop, and maintain SharePoint solutions using PowerShell, Graph API, C#, and SPFx. Customize SharePoint sites to meet business requirements, including workflows, web parts, and site templates.
2. Microsoft 365 Integration: Integrate SharePoint with other Microsoft 365 services, ensuring seamless collaboration and data flow. Develop and maintain applications using Microsoft Graph API to enhance functionality within the M365 ecosystem.
3. Automation and Scripting: Create and manage PowerShell scripts for automating SharePoint administrative tasks and deployments. Implement automation solutions to improve efficiency and reduce manual intervention.
4. Web Development: Develop web applications using ASP.NET and ensure they are integrated with SharePoint. Utilize Python for backend development and data processing tasks.
5. Azure Integration: Develop and deploy SharePoint solutions on Azure, leveraging cloud services for scalability and performance. Manage Azure resources and ensure optimal performance of SharePoint applications hosted on the cloud.
General Responsibilities:
1. Collaboration and Communication: Work closely with stakeholders to gather requirements and provide technical solutions. Collaborate with cross-functional teams to ensure successful project delivery.
2. Troubleshooting and Support: Provide technical support and troubleshooting for SharePoint-related issues. Ensure timely resolution of problems and maintain high availability of SharePoint services.
3. Documentation and Training: Document development processes, configurations, and solutions. Conduct training sessions for end users and team members on SharePoint functionalities and best practices.
Skills and Qualifications:
Technical Skills: Proficiency in PowerShell, Graph API, C#, ASP.NET, SPFx, Python, and Azure. Strong understanding of SharePoint architecture and Microsoft 365 services.
Experience: 5+ years of experience in SharePoint development and customization. Experience with multi-agent frameworks and cloud integration.
Soft Skills: Excellent problem-solving and analytical skills. Strong communication and teamwork abilities.
Engagement details:
Number of Openings: 1
ECMS ID (sourcing stage): 527982
Duration of Contract: 12 months
Total Yrs. of Experience: 10+
Relevant Yrs. of Experience: 6+ years
Detailed JD (Roles and Responsibilities): Attached
Mandatory skills: SPFx
Desired/Secondary skills: Multi-Agent Framework, ASP.NET, SPFx, Python, Azure
Max Vendor Rate (per day, in work-location currency): 10975 INR per day (114 EUR per day)
Client Interview / F2F Applicable: Yes
Work Location: Bangalore
Start date: 1-Jun-2025
WFO/WFH/Hybrid: WFO
BG Check (Pre/Hybrid/Post onboarding): Post
Working in shifts: UK shift
Posted 3 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Key Responsibilities of this position include:
Build and maintain robust, scalable data processing pipelines in Microsoft Azure.
Optimize data ingestion processes for better reliability and throughput.
Develop subject matter expertise in the various data models / data sources required for advanced analytics and reporting.
Conduct exploratory data analysis on data sources to assess suitability for predictive modeling project objectives.
Curate datasets for Data Scientist partners for analysis, machine learning, and AI models.
As needed, design and develop data models for analytics.
Drive data architecture design decisions considering future growth.
Engage and communicate effectively across diverse disciplines (Business Unit Leaders, Corporate Underwriting, and Actuarial) to collect data requirements.
Work closely with data scientists to understand their objectives and ensure architectural fit.
Qualifications: Successful candidates should possess the following skills/capabilities: Bachelor's degree (MS preferred) in Computer Science, Statistics, Math, or an equivalent combination of education and experience. 3-5 years of experience
Posted 3 weeks ago
4.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
ZEISS in India: ZEISS in India is headquartered in Bengaluru and present in the fields of Industrial Quality Solutions, Research Microscopy Solutions, Medical Technology, Vision Care, and Sports & Cine Optics. ZEISS India has 3 production facilities, an R&D center, Global IT services, and about 40 sales and service offices in almost all Tier I and Tier II cities in India. With 2,200+ employees and continued investments over 25 years in India, the ZEISS success story in India is continuing at a rapid pace. Further information at ZEISS India.
As a Data Platform Engineer, you will play a crucial role in developing, maintaining, and optimizing our data platform. Your expertise will drive the evolution of our platform, enabling seamless end-to-end transformation from raw data to insightful data products shared across the organization. You will collaborate closely with data product teams to understand their needs, challenges, and opportunities, ensuring the platform supports a wide range of data initiatives. You will also contribute to the architecture and design decisions of the platform, aligning with the principles of data mesh to promote domain-oriented, decentralized data ownership and architecture. As a Data Platform Engineer, you are also expected to participate in continuous learning and training to stay updated on the latest trends and technologies in data platform development.
A graduate degree in Computer Science or Software Engineering, with 4-6 years of experience as a Data Engineer or in a similar role. Strong proficiency in ETL and data processing using SQL. Extensive experience working in the Databricks environment. Proven experience in data warehousing, data modeling, data architecture, and building data pipelines. Hands-on experience with Azure Data Factory. Excellent problem-solving skills and the ability to troubleshoot data-related issues. Strong communication and collaboration skills to work effectively with global teams. A proactive mindset and willingness to learn new technologies and methodologies. Experience with the Microsoft Azure cloud platform. Familiarity with other big data technologies and tools. Understanding of data governance and security best practices. Develop relationships and processes with finance, sales, business operations, analytics, and other cross-functional stakeholders. Knowledge of Agile and DevOps practices is highly appreciated. Experience in FinOps is an added advantage.
Your ZEISS Recruiting Team: Itishree Pani
Posted 3 weeks ago
1.0 - 8.0 years
8 - 9 Lacs
Chennai
Work from Office
Review the program direction letter (PDL) to establish program details, production dates and feature availability, rules for feature availability summary codes, and option packs. Review the PDL and understand the change content that requires feature rule calibration. Maintain ordering tables in line with specified market requirements and system rules adherence. Liaise with Program, Scheduling, Plant, and Vehicle Engineering teams to achieve launch data enablers. Liaise with marketing teams on potential improvements to the order guide process. Hotline support for program teams: support expediting the process in case of any critical requirements. Define SAP HANA and its capabilities for the PPM and codification business. Understand how SAP HANA integrates with third-party applications. Discuss HANA's role in real-time data processing for feature code and part integration. Understand SAP Data Intelligence and its integration with HANA. Understand HANA replication in a multi-mode environment. Discuss backup, restore, and security measures in HANA. Good communication and interpersonal skills. Willing to work in N. America. To manage the product definition activities involving cross-functional coordination and enabling the ordering parameters for the North America region.
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
KLDiscovery, a leading global provider of electronic discovery, information governance, and data recovery services, is currently seeking a Senior Software Engineer for an exciting new opportunity. This person will develop core parts of our eDiscovery offerings, including software development, testing, and systems automation. They will collaborate with team members, product owners, designers, architects, and other development teams to research relevant technologies and build innovative solutions that enhance our offerings and exceed customer needs. If you like working in a creative, technology-driven, high-energy, collaborative, casual environment, and you have strong software development abilities, this is the opportunity for you! Hybrid or remote, work-from-home opportunity.
Responsibilities: Create, validate, and review program code per specifications. Develop automated unit and API tests. Support bug fixes and implement enhancements to applications in production. Create, design, and review software documentation. Utilize, communicate, and enforce coding standards. Provide technical support to applications in production within the defined SLA. Adhere to development processes and workflows. Assist and mentor the team, demonstrating technical excellence. Detect problems and areas that need improvement early, and raise issues.
Qualifications: Fluent English (C1). At least 4 years of commercial, hands-on software development experience in C#/.NET and C++. Experience with ASP.NET Core Blazor. Experience with desktop applications (WinForms preferred). Experience with background jobs and workers (e.g., Hangfire). Experience with Angular is a plus. Creating dataflow/sequence/C4 diagrams. Good understanding of at least one of the following architectural/design patterns: MVC, MVP, MVVM, Clean, Screaming, or Hexagonal architectures. .NET memory model and performance optimization solutions. Writing functional tests. Writing structure tests. Understanding modularity and vertical slices. Data privacy and securing desktop apps. Ability to design functionalities based on requirements. Experience with Entity Framework Core.
Our Cultural Values: Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are: Humble - no one is above another; we all work together to meet our clients' needs, and we acknowledge our own weaknesses. Hungry - we all are driven internally to be successful and to continually expand our contribution and impact. Smart - we use emotional intelligence when working with one another and with clients. Our culture shapes our actions, our products, and the relationships we forge with our customers.
Who We Are: KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies, and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance, and data recovery solutions to support the litigation, regulatory compliance, internal investigation, and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services.
In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction, and tape management. KLDiscovery has been recognized as one of the fastest-growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500), and CEO Chris Weiler has been honored as a past Ernst & Young Entrepreneur of the Year. Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers. KLDiscovery is an Equal Opportunity Employer. #LI-SN1 #LI-Remote
Posted 3 weeks ago
2.0 - 7.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
KLDiscovery, one of the largest US eDiscovery providers, is currently seeking a Software Engineer II for an exciting new opportunity. This person will develop core parts of our eDiscovery offerings, including software development, testing, and systems automation. They will collaborate with team members, product owners, designers, architects, and other development teams to research relevant technologies and build innovative solutions that enhance our offerings and exceed customer needs. Remote, work-from-home opportunity.
Responsibilities: Create, validate, and review program code per specifications. Review and analyze applications. Develop automated unit and API tests. Develop automated upgrade and deployment mechanisms. Validate releases. Support bug fixes and implement enhancements to applications in production. Create, design, and review software documentation. Utilize existing coding standards. Provide technical support to applications in production within the defined SLA. Adhere to development processes and workflows. Work in a team or with multiple organizational departments in a dynamic and rapidly changing environment. Seek and participate in personal development opportunities to maintain a detailed knowledge of industry standards, engineering best practices, and business needs.
Qualifications: Fluent English (C1). BS, BE, or B.Tech in Computer Science, Engineering, or a related scientific field preferred. 2+ years of experience in .NET/C#, plus front-end experience (Angular preferred). Good knowledge of unit/integration testing. SQL databases, preferably MS SQL and PostgreSQL. Analytical and problem-solving skills. Focus on continuous improvement. Excellent communication skills. Being a team player. Good knowledge of English.
Our Cultural Values: Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are: Humble - no one is above another; we all work together to meet our clients' needs, and we acknowledge our own weaknesses. Hungry - we all are driven internally to be successful and to continually expand our contribution and impact. Smart - we use emotional intelligence when working with one another and with clients. Our culture shapes our actions, our products, and the relationships we forge with our customers.
Who We Are: KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies, and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance, and data recovery solutions to support the litigation, regulatory compliance, internal investigation, and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction, and tape management. KLDiscovery has been recognized as one of the fastest-growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500). Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers.
KLDiscovery is an Equal Opportunity Employer. #LI-KV1 #LI-Remote
Posted 3 weeks ago
1.0 - 3.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Calix is hiring a Security Researcher to join their Threat Intelligence team in Bangalore. The successful candidate will lead efforts in identifying, analyzing, and mitigating network threats; develop and implement advanced threat detection and prevention strategies; and enhance threat detection capabilities through various research activities.
Key Responsibilities: Write, test, and optimize IPS signatures using the company's proprietary signature language and detection engine to identify network-based intrusions and malicious activities. Learn and master the proprietary signature syntax, functions, and capabilities to develop effective detection rules across various protocols and attack vectors. Analyze network traffic patterns and packet captures to create custom signatures tailored to the proprietary IPS platform's unique detection capabilities. Collaborate with engineering teams to understand platform-specific limitations and optimize signature performance within the proprietary engine environment. Support and contribute to internal automation processes designed to streamline threat detection activities and signature deployment workflows. Work closely with senior researchers and team leads to learn advanced detection methodologies and complex signature development techniques. Capture, analyze, and process indicators of compromise from diverse third-party threat intelligence sources and internal security tools. Monitor security advisories and vulnerability disclosures to identify new detection opportunities within the proprietary platform framework. Document research findings and contribute to knowledge-sharing initiatives that benefit the broader security research team. Gain hands-on experience with complex detection capabilities by shadowing senior team members during high-priority investigations and research projects. Develop expertise in the proprietary platform through guided learning sessions and practical application under senior supervision.
Qualifications: Bachelor's/Master's degree in Computer Science, Electrical Engineering, Cyber Security, or a related field. 1-3 years of experience in cybersecurity, network security, or threat detection roles. Understanding of core network concepts, including TCP/IP, protocols, routing, and traffic analysis. Familiarity with network security tools, IDS/IPS, and malware prevention/monitoring solutions. Working experience in signature writing and threat detection technologies. Working knowledge of both Windows- and Linux-based operating systems. Basic scripting capabilities in Python or similar languages for automation and data processing tasks. Must thrive within a team environment as well as on an individual basis. Natural curiosity and willingness to learn in a fast-paced, ever-changing threat landscape. A passion for cybersecurity, continuous improvement, and staying current on threat trends.
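As an illustration of the basic Python scripting this role calls for, here is a small sketch that normalizes, classifies, and deduplicates indicators of compromise from a feed; the feed layout and field names are assumptions made for the example.

```python
# Small IOC-processing sketch: normalize, deduplicate, and bucket indicators
# pulled from a JSON feed. The feed layout and field names are assumptions.
import ipaddress
import json

def classify(indicator: str) -> str:
    """Rough IOC type detection: IP address, hash-like hex string, or domain."""
    try:
        ipaddress.ip_address(indicator)
        return "ip"
    except ValueError:
        pass
    # MD5 / SHA-1 / SHA-256 lengths, hex characters only.
    if len(indicator) in (32, 40, 64) and all(c in "0123456789abcdef" for c in indicator.lower()):
        return "hash"
    return "domain"

with open("feed.json") as f:  # hypothetical third-party feed dump
    raw = json.load(f)

iocs: dict[str, set[str]] = {"ip": set(), "hash": set(), "domain": set()}
for entry in raw:
    value = entry.get("indicator", "").strip().lower()
    if value:
        iocs[classify(value)].add(value)

for kind, values in iocs.items():
    print(f"{kind}: {len(values)} unique indicators")
```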
Posted 3 weeks ago
3.0 - 4.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Role: Data Engineer. Role Type: Individual Contributor. Experience: 3-4 years.
Who Are We: BimaKavach is reimagining how Indian businesses access protection, with technology, speed, and simplicity at the core of everything we do. We proudly serve 3,000+ companies, including names like BSNL, Daikin, The Whole Truth, and CleverTap, and are backed by top investors like Waterbridge, Blume, Arali, and Eximius. Our mission: To safeguard every Indian business by 2047. Our mindset: Bold, fast-moving, and customer-obsessed. Join us at BimaKavach and be part of a once-in-a-generation opportunity to reshape how insurance works for millions of businesses. Bring your expertise, curiosity, and ambition, and help build the future of SME insurance in India.
Job Overview: As a Data Engineer at BimaKavach, you will be pivotal in building and maintaining the scalable data infrastructure and pipelines that drive our data-driven decision-making. You will work with large datasets, ensuring data quality, accessibility, and reliability for analytics, reporting, and machine learning initiatives within the insurance domain. This role requires strong expertise in data warehousing, ETL processes, and cloud-based data solutions.
Key Responsibilities: Design, build, and maintain robust and scalable data pipelines for data ingestion, transformation, and loading from various sources into our data warehouse. Develop and optimize ETL/ELT processes using appropriate tools and technologies. Work extensively with PostgreSQL for data storage, querying, and optimization. Manage data infrastructure on AWS EC2 and leverage other AWS services (e.g., S3, RDS) for data storage and processing. Ensure data quality, consistency, and reliability across all data pipelines and datasets. Collaborate with data scientists, analysts, and product teams to understand data requirements and deliver actionable insights. Implement monitoring and alerting for data pipelines to ensure data integrity and system health. Troubleshoot and resolve data-related issues, optimizing queries and data models for performance. Contribute to data governance, security, and compliance best practices. (Good to have): Experience with serverless functions (AWS Lambda/Google Cloud Functions) for event-driven data processing; see the sketch below.
Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related quantitative field. 3-4 years of professional experience in data engineering. Strong proficiency in SQL, especially with PostgreSQL. Proven experience building and maintaining data pipelines. Hands-on experience with AWS services, particularly EC2, and familiarity with other relevant services (S3, RDS, Glue, Redshift, etc.). Experience with scripting languages (e.g., Python, Node.js) for data manipulation and automation. Understanding of data warehousing concepts, data modeling, and ETL/ELT processes. Experience with big data technologies (e.g., Apache Spark, Hadoop) is a plus. Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. (Good to have): Experience with AWS Lambda or Google Cloud Functions for data processing.
Key Details: Joining: ASAP. Compensation: Market-competitive pay along with a variable performance-based component. Location: Bangalore or Indore.
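For the good-to-have serverless piece mentioned above, a minimal AWS Lambda handler for S3-triggered processing could look like this sketch; the bucket and object key come from the standard S3 event, while everything else is an illustrative assumption.

```python
# Minimal event-driven sketch: an AWS Lambda handler that reads a CSV dropped
# into S3 and counts its rows. Bucket and key come from the S3 event payload;
# the CSV contents and downstream use are illustrative assumptions.
import csv
import io

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = list(csv.DictReader(io.StringIO(body.decode("utf-8"))))
        print(f"processed s3://{bucket}/{key}: {len(rows)} rows")
    return {"status": "ok"}
```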
Posted 3 weeks ago
1.0 - 3.0 years
5 - 8 Lacs
Pune
Work from Office
The resource will work on and be responsible for SAP VC data maintenance for the IA business. It includes the following key activities related to VC material:
Business data mapping as per SAP VC data objects
Preparation of the model configuration data template
Preparation of the manufacturing (Super BOM / Super Routing) data template
Preparation of the pricing data template
Preparation of the purchase info records data template
VC data load in the SAP quality client for business validation (model configuration data, Super BOM, Super Routing, dependencies, pricing, etc.)
Coordinate with the Global Data Team and set up VC data in the SAP production client after business validation in the SAP quality client
Support business validation in the SAP quality client for any queries, and fix issues/defects
Resolution of VC data related issues in the SAP production client
As part of ongoing data maintenance, implement changes in the existing VC data, based on business inputs, using the Engineering Change Management process
Preparation and circulation of progress reports periodically
Active participation in meetings/calls with customers and within the team
Education Qualification: B.E/BCS/MCA/MCS/MCM. 1-3 years of experience in data processing in any ERP system. Proficient in Microsoft Office: Excel, Word, Outlook. Ability to communicate clearly in English, both written and verbal. Courteous approach to working with peers and with customers. Independent thinker and self-starter within a team context. Ability to work in a global environment.
Posted 3 weeks ago
1.0 - 6.0 years
4 - 8 Lacs
Rohtak
Work from Office
Education: B.E./B.Tech/M.Tech with distinction.
Job Responsibility: Team member or team leader for measurement, data analysis, and countermeasure proposals for improving sound quality and the customer driving experience: Capable of carrying out noise and vibration measurement. Analyzing the sound spectrum for psychoacoustic parameters like roughness, sharpness, loudness, articulation index, tonality, etc. Capable of performing jury evaluation and developing a framework for subjective and objective data correlation. Collaborate with design, simulation, and product development teams to implement NVH optimization strategies during the product lifecycle based on sound quality inputs.
Competency Requirements: Experience with sound quality analysis, including subjective/objective evaluations, using tools such as HEAD Acoustics or similar. Experience with NVH test data analysis software like Siemens Testlab, BK Connect, BK Sonoscout, Muller BBM PAK, etc. Hands-on experience using measurement sensors: microphones, accelerometers, tacho sensors, CAN data loggers, binaural headsets, volume velocity sources, data acquisition systems, etc. Expertise in using jury evaluation software. Hands-on experience with Python for data processing, process automation, and machine learning model development is a plus. Team player with good analytical/presentation and decision-making skills. Future planning and conceptualization thinking. Subordinate development and motivational skills.
Specific Expertise: Mastery of recording, mixing, and masking noise techniques is crucial, along with know-how of the latest audio software and hardware innovations. Skill in driving at high speed and on different road patterns.
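As a small example of the Python data-processing side this posting mentions, the sketch below computes a spectrum from a recorded noise signal with SciPy; the input WAV file is a placeholder, and the level estimate is uncalibrated (the listed psychoacoustic metrics require dedicated tooling such as HEAD Acoustics).

```python
# Illustrative NVH data-processing sketch: load a microphone recording and
# estimate its power spectral density and dominant frequency. The input file
# is a placeholder; calibrated psychoacoustic metrics need dedicated tools.
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

rate, signal = wavfile.read("cabin_noise.wav")   # hypothetical recording
signal = signal.astype(np.float64)
if signal.ndim > 1:
    signal = signal[:, 0]                        # one channel of a binaural pair

freqs, psd = welch(signal, fs=rate, nperseg=4096)
dominant = freqs[np.argmax(psd)]
print(f"dominant frequency: {dominant:.1f} Hz")

# Uncalibrated level estimate relative to full scale (not absolute dB SPL).
rms = np.sqrt(np.mean(signal**2))
print(f"RMS level: {20 * np.log10(rms / np.max(np.abs(signal))):.1f} dBFS")
```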
Posted 3 weeks ago
6.0 - 9.0 years
50 - 100 Lacs
Pune
Work from Office
Experience in deep learning engineering (mostly MLOps)
Strong NLP/LLM experience and processing text using LLMs
Proficient in PySpark/Databricks and Python programming
Building backend applications (data processing etc.) using Python and deep learning frameworks
Deploying models and building APIs (FastAPI, Flask)
Need to have experience working with GPUs
Working knowledge of vector databases like 1) Milvus 2) Azure Cognitive Search 3) Qdrant etc.
Experience with transformers and Hugging Face models like Llama, Mixtral, and embedding models
Exp: 6 - 9 yrs
Location: Pune - Hybrid
Good to have: Knowledge and experience in Kubernetes, Docker, etc.
Cloud experience working with VMs and Azure storage
Sound data engineering experience
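For illustration only, here is a minimal FastAPI embedding endpoint of the kind this role combines (model serving plus vectors for a store such as Milvus or Qdrant). The module layout and model name are assumptions, not part of the posting.

    # Illustrative only: serve text embeddings over a FastAPI endpoint.
    from fastapi import FastAPI
    from pydantic import BaseModel
    from sentence_transformers import SentenceTransformer

    app = FastAPI()
    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

    class Query(BaseModel):
        text: str

    @app.post("/embed")
    def embed(query: Query) -> dict:
        # Encode the input text into a dense vector for downstream
        # similarity search in a vector database.
        vector = model.encode(query.text).tolist()
        return {"dim": len(vector), "embedding": vector}

    # Run with: uvicorn app:app --reload  (module name assumed)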
Posted 3 weeks ago
0.0 - 1.0 years
8 - 12 Lacs
Mohali
Work from Office
Assist in the development and implementation of machine learning models and algorithms.
Write clean, efficient, and reusable Python code for data processing, modeling, and deployment.
Work on datasets: data cleaning, preprocessing, feature engineering, and visualization.
Collaborate with senior data scientists and developers to build scalable ML solutions.
Conduct literature reviews and stay updated on the latest ML/AI research.
Contribute to model deployment and integration with APIs or web applications (basic knowledge sufficient).
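As an illustration of the cleaning and feature-engineering work this internship involves, here is a minimal pandas/scikit-learn sketch (not from the posting); the CSV path and column names are hypothetical placeholders.

    # Illustrative only: clean a dataset and derive model-ready features.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("customers.csv")  # assumed raw dataset

    # Cleaning: drop exact duplicates and fill missing numeric values.
    df = df.drop_duplicates()
    df["age"] = df["age"].fillna(df["age"].median())

    # Feature engineering: derive a tenure-in-years feature from a date column.
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
    df["tenure_years"] = (pd.Timestamp.now() - df["signup_date"]).dt.days / 365.25

    # Scale numeric features so downstream models train on comparable ranges.
    numeric_cols = ["age", "tenure_years"]
    df[numeric_cols] = StandardScaler().fit_transform(df[numeric_cols])
    print(df.head())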
Posted 3 weeks ago
2.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
At PowerSchool, we are a dedicated team of innovators guided by our shared purpose of powering personalized education for students around the world. From the central office to the classroom to the home, PowerSchool supports the entire educational ecosystem as the global leader of cloud-based software for K-12 education. Our employees make it all possible, and a career with us means you're joining a successful team committed to engaging, empowering, and improving the K-12 education experience everywhere.
Team Overview
Our Sales team serves the critical role of getting our products into the hands of our customers. This talented team works hard to build and maintain relationships with our clients, as well as looks for new business opportunities to expand our reach.
Responsibilities
The Sales Operations team focuses on optimizing sales processes, managing data, and improving efficiency within the Sales team. The team handles tasks such as forecasting, pipeline management, and sales analytics to drive successful outcomes. Your day-to-day job will consist of:
Providing cross-functional support and assisting Sales Leaders in the development of new tools, processes, and projects as needed.
Maintaining common Sales Ops files (territory rosters, quota and compensation files, sales org charts, job descriptions, UKG data integrity files, holdovers, etc.)
Helping create and distribute new incentive compensation plans and amendments for the Sales organization.
Providing overall support and departmental coordination for Sales Operations team inquiries, sales performance reporting and analysis requests, and ad-hoc projects.
Working inside Salesforce.com to support the Sales organization with opportunity validation, territory updates, data questions, reporting/dashboard administration, order processing, data matching, and approval submissions.
Handling data load requests for new hires, transfers, and SFDC clean-up efforts.
Researching any SFDC data concerns and providing recommendations to leadership.
Updating and maintaining all job descriptions for the Sales organization.
Qualifications
Minimum Qualifications
Bachelor's degree required
2+ years of sales operations experience
Excellent verbal and interpersonal skills
Experience working in SFDC (reporting, dashboards)
Must have Excel knowledge (tables, pivots, XLOOKUP, etc.)
Proficient in MS Office Suite
High energy and a positive problem-solving attitude
Ability to work under pressure with strict deadlines
Detail oriented
Ability to multi-task
EEO Commitment
PowerSchool is committed to a diverse and inclusive workplace. PowerSchool is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. Our inclusive culture empowers PowerSchoolers to deliver the best results for our customers. We not only celebrate the diversity of our workforce, we celebrate the diverse ways we work.
Posted 3 weeks ago
0.0 years
1 - 5 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Call Handling, Messaging: Answer inbound calls from job seekers, listen to their needs, and qualify them. Provide information on WhatsApp. Pass qualified leads to the recruitment team in a professional and timely manner. Work From Home.
Required Candidate Profile: Immediate joiner; work from home; candidate should be from Hyderabad, New Delhi, Mumbai, Pune, or Bangalore.
Posted 3 weeks ago
2.0 - 5.0 years
13 - 17 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Adhere to standard methodologies for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR
Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR
Diploma and 7 to 9 years of Computer Science, IT, or related field experience
Preferred Qualifications:
Functional Skills:
Must-Have Skills:
Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
Experience with data warehousing platforms such as Amazon Redshift or Snowflake
Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
Good-to-Have Skills:
Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
Strong understanding of data modeling, data warehousing, and data integration concepts
Understanding of machine learning pipelines and frameworks for ML/AI models
Professional Certifications:
AWS Certified Data Engineer (preferred)
Databricks Certified (preferred)
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
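For illustration only, here is a minimal PySpark ETL step of the kind the posting describes; the S3 paths, column names, and app name are hypothetical placeholders, not Amgen's actual pipeline.

    # Illustrative only: extract raw events, standardize them, write Parquet.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw order events (path assumed).
    raw = spark.read.json("s3://raw-zone/orders/")

    # Transform: standardize types, drop malformed rows, add a load date.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())
           .withColumn("load_date", F.current_date())
    )

    # Load: write partitioned Parquet for the warehouse layer.
    clean.write.mode("append").partitionBy("load_date").parquet("s3://curated-zone/orders/")

    spark.stop()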
Posted 3 weeks ago
0.0 years
1 - 3 Lacs
Ahmedabad
Work from Office
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Mega Virtual Drive for Customer Service roles - English + Hindi Language on 2nd June 2025 (Monday) || Ahmedabad Location
Date: 2-June-2025 (Monday)
MS Teams meeting ID: 448 855 564 400 5
MS Teams Passcode: g6p5m9AU
Time: 12:00 PM - 1:00 PM
Job Location: Ahmedabad (Work from office)
Languages Known: Hindi + English
Shifts: Flexible with any shift
Responsibilities
Respond to customer queries and concerns
Provide support for data collection to enable recovery of the account for the end user
Maintain a deep understanding of client processes and policies
Reproduce customer issues and escalate product bugs
Provide excellent customer service to our customers
Exhibit capacity for critical thinking and analysis
Showcase a proven work ethic, with the ability to work well both independently and within the context of a larger collaborative environment
Qualifications we seek in you
Minimum qualifications
Graduate (any discipline except law)
Only freshers are eligible
Fluency in English and Hindi is mandatory
Preferred qualifications
Effective probing and analyzing/understanding skills
Analytical skills with a customer-centric approach
Excellent proficiency in written English with a neutral English accent
Ability to work on a flexible schedule (including weekend shifts)
Why join Genpact?
Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
Make an impact: drive change for global enterprises and solve business challenges that matter
Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture: our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Note: Please keep your E-Aadhaar card handy while appearing for the interview.
Posted 3 weeks ago