Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description

Oracle Cloud Infrastructure (OCI) is a strategic growth area for Oracle. It is a comprehensive cloud service offering spanning Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). OCI is committed to providing the best in cloud products to meet the needs of customers who are tackling some of the world's biggest challenges.

About the Network Monitoring (NM) Team

Networking is a mission-critical part of the OCI cloud. Our customers want higher availability, more visibility, greater network security, better network performance and throughput, better capacity planning, root cause analysis, and prediction of failures. We help OCI build best-in-class cloud monitoring solutions that provide performance monitoring, what-if analysis, AI-enabled root cause analysis, prediction, and capacity planning for Oracle's global cloud network infrastructure. Our mission is to build monitoring services that comprehensively view, analyze, plan, and optimize how we scale and operate our networks.

Responsibilities

We are looking for a Senior Member of Technical Staff for the OCI Network Monitoring team with the expertise and passion to solve difficult problems in globally distributed systems and to build cloud-native observability and analytics solutions at scale using innovative AI/ML techniques. You should be comfortable building complex distributed AI/ML systems that handle huge volumes of data: collecting metrics, building data pipelines, and generating analytics with AI/ML for both real-time and batch processing (a toy anomaly-detection sketch follows this posting). If you are passionate about designing, developing, testing, and delivering AI/ML-based observability services, and are excited to learn and thrive in a fast-paced environment, the NM team is the place for you.

Required/Desired Qualifications (adjust per focus area):
- BS/MS (or equivalent) in Computer Science or a relevant area
- 7+ years of experience in software development
- 1-2 years of experience developing AI/ML applications using ML models
- Proficiency with Java/Python/C++ and object-oriented programming
- Knowledge of networking protocols such as TCP/IP, Ethernet, BGP, and OSPF
- Network management technologies such as SNMP, NetFlow, BGP Monitoring Protocol, and gNMI
- Excellent knowledge of data structures and search/sort algorithms
- Excellent organizational, verbal, and written communication skills
- Knowledge of cloud computing and networking technologies, including monitoring services
- Operational experience running and troubleshooting large networks
- Experience developing service-oriented systems
- Exposure to Hadoop, Spark, Kafka, Storm, OpenTSDB, Elasticsearch, or other distributed compute platforms
- Exposure to LLM frameworks such as LangChain and LlamaIndex
- Exposure to LLMs such as GPT-4, Llama 3.1, and Cohere Command
- Experience with Jira, Confluence, and Bitbucket
- Knowledge of Scrum and Agile methodologies

Qualifications

Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes.
We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
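As a flavor of the metric-analytics work the posting above describes, here is a minimal, illustrative sketch (not Oracle's actual implementation) of one common building block: flagging network-latency samples whose rolling z-score exceeds a threshold. The series, window, and threshold are all assumptions for the example.

```python
# Minimal sketch: rolling z-score anomaly detection on a latency series.
# Window size and threshold are illustrative, not tuned values.
import numpy as np
import pandas as pd

def flag_anomalies(latency_ms: pd.Series, window: int = 60, z_thresh: float = 3.0) -> pd.Series:
    """Mark points more than z_thresh std devs from the rolling mean."""
    rolling_mean = latency_ms.rolling(window, min_periods=window).mean()
    rolling_std = latency_ms.rolling(window, min_periods=window).std()
    z = (latency_ms - rolling_mean) / rolling_std
    return z.abs() > z_thresh

# Synthetic per-second latency (~20 ms baseline) with an injected spike
rng = np.random.default_rng(0)
series = pd.Series(rng.normal(20.0, 1.5, 600))
series.iloc[300:305] += 40  # simulated incident
print(series[flag_anomalies(series)].head())
```

A production system would of course replace the synthetic series with real telemetry and layer smarter models on top, but the detect-on-deviation pattern is the same.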
Posted 1 week ago
2.0 - 3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description

The Data Engineer supports, develops, and maintains a data and analytics platform to efficiently process, store, and make data available to analysts and other consumers. This role collaborates with business and IT teams to understand requirements and best leverage technologies for agile data delivery at scale.

Note: Although the role is categorized as Remote, it follows a hybrid work model.

Key Responsibilities
- Implement and automate deployment of distributed systems for ingesting and transforming data from various sources (relational, event-based, unstructured).
- Develop and operate large-scale data storage and processing solutions using cloud-based platforms (e.g., data lakes, Hadoop, HBase, Cassandra, MongoDB, DynamoDB).
- Ensure data quality and integrity through continuous monitoring and troubleshooting.
- Implement data governance processes, managing metadata, access, and data retention.
- Develop scalable, efficient, quality data pipelines with monitoring and alerting mechanisms.
- Design and implement physical data models and storage architectures based on best practices.
- Analyze complex data elements and systems, data flow, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
- Participate in testing and troubleshooting of data pipelines.
- Use agile development practices such as DevOps, Scrum, and Kanban for continuous improvement in data-driven applications.

Qualifications, Skills, and Experience

Must-Have
- 2-3 years of experience in data engineering, with expertise in Azure Databricks and Scala/Python.
- Hands-on experience with Spark (Scala/PySpark) and SQL.
- Strong understanding of Spark Streaming, Spark internals, and query optimization (see the streaming sketch at the end of this posting).
- Proficiency in Azure cloud services.
- Agile development experience.
- Experience in unit testing of ETL pipelines.
- Expertise in creating ETL pipelines that integrate ML models.
- Knowledge of big data storage strategies (optimization and performance).
- Strong problem-solving skills.
- Basic understanding of data models (SQL/NoSQL), including Delta Lake or lakehouse architectures.
- Exposure to agile software development methodologies.
- Quick learner with adaptability to new technologies.

Nice-to-Have
- Understanding of the ML lifecycle.
- Exposure to big data open-source technologies.
- Experience with clustered compute cloud-based implementations.
- Familiarity with developing applications requiring large file movement in cloud environments.
- Experience building analytical solutions.
- Exposure to IoT technology.

Competencies
- System Requirements Engineering: Translates stakeholder needs into verifiable requirements.
- Collaborates: Builds partnerships and works collaboratively with others.
- Communicates Effectively: Develops and delivers clear communications for various audiences.
- Customer Focus: Builds strong customer relationships and delivers customer-centric solutions.
- Decision Quality: Makes timely and informed decisions to drive progress.
- Data Extraction: Performs ETL activities from various sources using appropriate tools and technologies.
- Programming: Writes and tests computer code using industry standards, tools, and automation.
- Quality Assurance Metrics: Applies measurement science to assess solution effectiveness.
- Solution Documentation: Documents and communicates solutions to enable knowledge transfer.
- Solution Validation Testing: Ensures configuration changes meet design and customer requirements.
- Data Quality: Identifies and corrects data flaws to support governance and decision-making.
- Problem Solving: Uses systematic analysis to identify and resolve issues effectively.
- Values Differences: Recognizes and values diverse perspectives and cultures.

Education, Licenses, and Certifications

College, university, or equivalent degree in a relevant technical discipline, or equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Work Schedule

Work primarily with stakeholders in the US, requiring a 2-3 hour overlap during EST hours as needed.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2411641
Relocation Package: No
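The streaming sketch referenced above: a minimal, hedged example of the Spark Structured Streaming pattern this posting asks about (read from Kafka, parse JSON, windowed aggregation with a watermark, checkpointed write). The topic, broker address, schema, and output paths are assumptions for illustration, not a Cummins pipeline.

```python
# Illustrative Structured Streaming ETL sketch; all names/paths are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("stream-etl-sketch").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("ts", TimestampType()),
])

# Read raw events from an assumed Kafka topic
raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "events")
       .load())

parsed = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# 5-minute average per device, tolerating 10 minutes of late data
agg = (parsed.withWatermark("ts", "10 minutes")
       .groupBy(F.window("ts", "5 minutes"), "device_id")
       .agg(F.avg("reading").alias("avg_reading")))

# Checkpointing makes the query restartable with exactly-once file output
query = (agg.writeStream.outputMode("append")
         .format("parquet")
         .option("path", "/tmp/out")
         .option("checkpointLocation", "/tmp/chk")
         .start())
```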
Posted 1 week ago
7.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description

Oracle Cloud Infrastructure (OCI) is a strategic growth area for Oracle. It is a comprehensive cloud service offering spanning Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). OCI is committed to providing the best in cloud products to meet the needs of customers who are tackling some of the world's biggest challenges.

About the Network Monitoring (NM) Team

Networking is a mission-critical part of the OCI cloud. Our customers want higher availability, more visibility, greater network security, better network performance and throughput, better capacity planning, root cause analysis, and prediction of failures. We help OCI build best-in-class cloud monitoring solutions that provide performance monitoring, what-if analysis, AI-enabled root cause analysis, prediction, and capacity planning for Oracle's global cloud network infrastructure. Our mission is to build monitoring services that comprehensively view, analyze, plan, and optimize how we scale and operate our networks.

Responsibilities

We are looking for a Senior Member of Technical Staff for the OCI Network Monitoring team with the expertise and passion to solve difficult problems in globally distributed systems and to build cloud-native observability and analytics solutions at scale using innovative AI/ML techniques. You should be comfortable building complex distributed AI/ML systems that handle huge volumes of data: collecting metrics, building data pipelines, and generating analytics with AI/ML for both real-time and batch processing. If you are passionate about designing, developing, testing, and delivering AI/ML-based observability services, and are excited to learn and thrive in a fast-paced environment, the NM team is the place for you.

Required/Desired Qualifications (adjust per focus area):
- BS/MS (or equivalent) in Computer Science or a relevant area
- 7+ years of experience in software development
- 1-2 years of experience developing AI/ML applications using ML models
- Proficiency with Java/Python/C++ and object-oriented programming
- Knowledge of networking protocols such as TCP/IP, Ethernet, BGP, and OSPF
- Network management technologies such as SNMP, NetFlow, BGP Monitoring Protocol, and gNMI
- Excellent knowledge of data structures and search/sort algorithms
- Excellent organizational, verbal, and written communication skills
- Knowledge of cloud computing and networking technologies, including monitoring services
- Operational experience running and troubleshooting large networks
- Experience developing service-oriented systems
- Exposure to Hadoop, Spark, Kafka, Storm, OpenTSDB, Elasticsearch, or other distributed compute platforms
- Exposure to LLM frameworks such as LangChain and LlamaIndex
- Exposure to LLMs such as GPT-4, Llama 3.1, and Cohere Command
- Experience with Jira, Confluence, and Bitbucket
- Knowledge of Scrum and Agile methodologies

Qualifications

Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes.
We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 week ago
12.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Description

As a Care Engineer for Nokia Mediation, you'll deliver end-to-end global support while combining technical expertise with hands-on development, owning code customizations, fixes, and delivery to customers. You'll act as a technical leader and mentor, contributing to global improvement initiatives and serving as a Subject Matter Expert. While the role is primarily remote, it may occasionally involve on-site customer visits. You'll also be part of a 24x7 support rotation, ensuring high service availability across supported regions.

How You Will Contribute and What You Will Learn
- Deliver end-to-end (L2-L4) support for Nokia's Digital Business suite, primarily Nokia Mediation, ensuring timely resolution of customer issues within SLA through root cause analysis, solution delivery, and source code fixes.
- Meet and exceed Care quality standards and KPIs while actively contributing to a high-performance, innovation-driven support culture.
- Collaborate with cross-functional teams to address support and project-related needs efficiently and effectively.
- Engage directly with customers, requiring strong communication skills and the ability to manage expectations in high-pressure environments.
- Participate in 24x7 emergency support rotations while contributing to continuous improvement initiatives focused on Care efficiency, product enhancement, and overall customer experience.

Key Skills and Experience

You have:
- A Bachelor's/Master's degree or equivalent, with over 12 years of hands-on experience in technical support, service deployment, or software development for complex software applications, including L3/L4 support and R&D involvement
- At least 10 years of practical expertise in UNIX/Linux scripting, with strong proficiency in operating systems and shell environments
- Proven experience with databases and programming languages, including Redis, Postgres, MariaDB, Oracle, Hadoop, SQL, Java, C, Perl, and PL/SQL
- Strong knowledge of networking, IP protocols, and cloud technologies, along with virtualization and clustering platforms such as OpenStack, VMware vSphere, OpenShift, and Kubernetes

It would be good if you also had:
- Motivation, independence, and the ability to build and maintain good relationships with customers and internal stakeholders

About Us

Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed, and cloud networks. Your career here will have a positive impact on people's lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks, and fearless about bringing our authentic selves to work.

What We Offer

Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and be supported by employee resource groups, mentoring programs, and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer. Nokia has received the following recognitions for its commitment to inclusion and equality:
- One of the World's Most Ethical Companies by Ethisphere
- Gender-Equality Index by Bloomberg
- Workplace Pride Global Benchmark

At Nokia, we act inclusively and respect the uniqueness of people.
Nokia’s employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed.

About the Team

As Nokia's growth engine, we create value for communication service providers and enterprise customers by leading the transition to cloud-native software and as-a-service delivery models. Our inclusive team of dreamers, doers and disruptors push the limits from impossible to possible.
Posted 1 week ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

About Us

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services

Global Business Services delivers technology and operations capabilities to Bank of America's lines of business and staff support functions through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview

GF (Global Finance): Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to lines of business (LOBs) and Enterprise Finance functions. The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic Initiatives. The Financed Emissions Accounting & Reporting team, part of the Global Financial Control-Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset-level balance sheet financed emissions, which are integral to the Bank's goal of achieving net-zero greenhouse gas emissions by 2050.

Job Description

The role is responsible for building data sourcing processes, performing data research and analytics using available tools, supporting model input data monitoring, and developing the data and reporting frameworks that support our approaches to net-zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net-zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage, and controls by building, improving and executing data processes.
Candidates must be able to communicate across technology partners, the climate office, and the business lines to execute viable analytical solutions, with a focus on end-user experience and usability, and must be strong in identifying and explaining data quality issues to help achieve successful, validated data for model execution. This individual should feel at ease creating complex SQL queries, extracting large raw datasets from various sources, merging and transforming raw data into usable data and analytic structures, and benchmarking results against known baselines. They must be comfortable automating repeatable processes, generating data insights that are easy for end users to interpret, conducting quantitative analysis, and effectively communicating and disseminating findings to stakeholders. They should also understand greenhouse gas accounting frameworks and financed emissions calculations as applied to different sectors and asset classes. The candidate will have experience representing ERA with critical climate stakeholders across the firm, and should demonstrate capacity for strategic leadership, exercising significant independent judgment and discretion and working toward strategic goals with limited oversight.

Responsibilities
- Net-zero transition planning and execution: Partners with GEG, the Program Office and Lines of Business in developing and executing the enterprise-wide net-zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities and data procurement, liaising with consultants, external data providers, and the Climate Risk and Technology functions.
- Data development and operations: Researches data requirements; produces executive-level and detailed data summaries; validates the accuracy, completeness, reasonableness and timeliness of datasets; and develops desktop procedures for BAU operations. Performs data review and tests technology implementation for financed emissions deliverables. Executes BAU processes such as new data cycle creation, data controls and data quality processes. Produces data summary materials and walks leadership through them.
- Data analytics and strategy: Analyzes the data and explains how granular data movements across history affect new results. Identifies trends of data improvement and areas for improvement. Develops automated data analysis results and answers common questions to justify changes in the data. Supports ad hoc analytics of bank-wide and client net-zero commitment implementation, with an initial focus on automating financed emissions analysis, reporting against PCAF standards (a simplified attribution example follows this posting), and net-zero transition preparedness analytics and engagement to enhance the strategy for meeting emissions goals in target sectors.
Requirements

Education
- Bachelor's degree in data management or analytics, engineering, sustainability, finance or another related field, OR a master's degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or another related field

Certifications (if any): NA

Experience Range
- 15+ years in climate, financed emissions, finance, or financial reporting
- Three (3) or more years of experience in statistical and/or data management, analytics and visualization (intersection with financial services strongly preferred)

Foundational Skills
- Deep expertise in SQL, Excel, Python, automation and optimization, and project management
- Knowledge of data architecture concepts, data models, and ETL processes
- Deep understanding of how data processes work and the ability to solve the dynamically evolving, complex data challenges that are part of day-to-day activities
- Experience extracting and combining data from multiple sources, and aggregating data to support model development
- Experience in multiple database environments such as Oracle, Hadoop, and Teradata
- Strong technical and visualization skills, with the ability to understand business goals and needs and a commitment to delivering recommendations that will guide strategic decisions
- Knowledge of Alteryx, Tableau, and R (knowledge of NLP, data scraping and generative AI welcome)
- Strong leadership skills and a proven ability to motivate employees and promote teamwork
- Excellent interpersonal, management, and teamwork skills
- High level of independent decision-making ability
- Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines
- Demonstrated ability to motivate others in a high-stress environment to achieve goals
- Ability to communicate effectively, orally and in writing, and to resolve conflicts with both internal and external clients
- Ability to adapt to a dynamic and evolving work environment
- Well-developed analytical and problem-solving skills
- Experience with and knowledge of the principles and practices of management and employee development
- Ability to think critically and solve problems with rational solutions
- Ability to react and make decisions quickly under pressure with good judgment
- Strong documentation and presentation skills, explaining data analysis visually and procedurally as suits the audience
- Ability to quickly identify risks and determine reasonable solutions

Desired Skills
- Advanced knowledge of finance
- Advanced knowledge of climate risk

Work Timings: 12:30 PM to 9:30 PM (9-hour shift; may require stretch during peak periods)

Job Location: Mumbai
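The simplified attribution example referenced above: under the PCAF standard, financed emissions for a listed-company loan are commonly attributed as (outstanding amount / EVIC) x borrower emissions. The sketch below is illustrative only, with made-up figures and assumed column names, not the bank's actual model.

```python
# Simplified PCAF-style financed-emissions attribution (illustrative data).
import pandas as pd

loans = pd.DataFrame({
    "borrower": ["A", "B"],
    "outstanding_usd": [50e6, 120e6],
    "evic_usd": [1e9, 3e9],              # enterprise value incl. cash (EVIC)
    "scope1_2_tco2e": [200_000, 750_000],  # borrower's reported emissions
})

# Attribution factor = share of the borrower's value that the bank finances
loans["attribution_factor"] = loans["outstanding_usd"] / loans["evic_usd"]
loans["financed_tco2e"] = loans["attribution_factor"] * loans["scope1_2_tco2e"]
print(loans[["borrower", "attribution_factor", "financed_tco2e"]])
```

Real calculations vary by asset class (mortgages, project finance, sovereign debt use different denominators) and carry PCAF data-quality scores alongside the numbers.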
Posted 1 week ago
5.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Hybrid
Location: Hyderabad (Hybrid). Please share your resume with +91 9361912009.

Roles and Responsibilities
- Deep understanding of Linux, networking and security fundamentals.
- Experience working with the AWS cloud platform and infrastructure.
- Experience working with infrastructure as code using Terraform or Ansible.
- Experience managing large big data clusters in production (at least one of Cloudera, Hortonworks, EMR).
- Excellent knowledge and solid work experience providing observability for big data platforms using tools such as Prometheus, InfluxDB, Dynatrace, Grafana, and Splunk.
- Expert knowledge of the Hadoop Distributed File System (HDFS) and Hadoop YARN.
- Decent knowledge of the various Hadoop file formats such as ORC, Parquet, and Avro.
- Deep understanding of the Hive (Tez), Hive LLAP, Presto and Spark compute engines.
- Ability to understand query plans and optimize performance for complex SQL queries on Hive and Spark (see the sketch after this list).
- Experience supporting Spark with the Python (PySpark) and R (sparklyr, SparkR) languages.
- Solid professional coding experience with at least one scripting language, such as Shell or Python.
- Experience working with Data Analysts and Data Scientists, and with at least one related analytical application such as SAS, RStudio, JupyterHub, or H2O.
- Able to read and understand code (Java, Python, R, Scala), with expertise in at least one scripting language such as Python or Shell.

Nice-to-have skills:
- Experience with workflow management tools like Airflow or Oozie.
- Knowledge of analytical libraries such as Pandas, NumPy, SciPy, and PyTorch.
- Implementation history with Packer, Chef, Jenkins or similar tooling.
- Prior working knowledge of Active Directory and Windows-based VDI platforms such as Citrix and AWS WorkSpaces.
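The query-plan sketch referenced in the list above: a hedged, minimal example of the inspection workflow this role calls for, using PySpark's explain() to verify filter pushdown and join strategy before tuning. The table and column names are made up for illustration.

```python
# Hypothetical plan-inspection workflow; table/column names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder.appName("plan-check")
         .enableHiveSupport().getOrCreate())

orders = spark.table("sales.orders")        # assumed Hive table
customers = spark.table("sales.customers")  # assumed Hive table

# Filter early and hint a broadcast join for the small dimension table
joined = (orders.filter(F.col("order_date") >= "2024-01-01")
          .join(F.broadcast(customers), "customer_id"))

# 'formatted' mode (Spark 3+) prints scans, pushed filters, and the join
# strategy, so you can confirm the optimizer did what you expected.
joined.explain(mode="formatted")
```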
Posted 1 week ago
4.0 - 6.0 years
8 - 18 Lacs
Bengaluru
Work from Office
We are seeking a skilled Data Engineer and Data Analyst with over 4 years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools.

Key Responsibilities:
- Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, dbt, Composer (GCP), Control-M, cron, Luigi, and similar tools (a minimal Airflow example follows this posting).
- Build and optimize data architectures, including data lakes and data warehouses.
- Integrate data from multiple sources, ensuring data quality and consistency.
- Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions.
- Analyze complex datasets to identify trends, generate actionable insights, and support decision-making.
- Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation.
- Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB.
- Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Use cloud data services and warehouses such as AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery.
- Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools.
- Ensure data governance, security, and compliance standards are met.
- Participate in Agile and DevOps processes to enhance data engineering workflows.

Required Qualifications:
- 4+ years of professional experience in data engineering and data analysis roles.
- Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB.
- Hands-on experience with big data tools like Hadoop and Apache Spark.
- Proficiency in Python programming.
- Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks.
- Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, dbt, Composer (GCP), Control-M, cron, and Luigi.
- Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory).
- Understanding of data modeling concepts and data lake/data warehouse architectures.
- Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows.
- Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB.
- Exposure to Agile and DevOps methodologies.
- Experience with at least one cloud platform:
  - Google Cloud Platform (BigQuery, Dataflow, Composer, Cloud Storage, Pub/Sub)
  - Amazon Web Services (S3, Glue, Redshift, Lambda, Athena)
  - Microsoft Azure (Data Factory, Synapse Analytics, Blob Storage)

Preferred Skills:
- Strong problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.
- Experience with service development, REST APIs, and automation testing is a plus.
- Familiarity with version control systems and workflow automation.
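The Airflow example referenced above: a minimal DAG sketch of the extract-transform-load pattern the responsibilities describe. The DAG id, task names, and callables are assumptions for illustration, not a specific production pipeline; the `schedule` argument assumes Airflow 2.4 or later.

```python
# Minimal Airflow ETL DAG sketch; names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw data from source systems")

def transform(**_):
    print("clean, join, and aggregate")

def load(**_):
    print("write results to the warehouse")

with DAG(
    dag_id="etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load
    t_extract >> t_transform >> t_load
```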
Posted 1 week ago
0.0 - 4.0 years
3 - 7 Lacs
Pune
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About the Role

Role Purpose: Support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do

Oversee and support the process:
- Review daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it in a timely manner to TA & SES
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Scala programming.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
7.0 - 12.0 years
17 - 22 Lacs
Bengaluru
Work from Office
About the Role

Key Responsibilities:

1. MuleSoft Development (PCE Instance):
- Design, develop, and deploy integration solutions using MuleSoft Anypoint Platform with PCE (Private Cloud Edition).
- Create and maintain APIs, integrations, and workflows using Mule 4.x.
- Develop and manage RAML/OpenAPI specifications for RESTful APIs.
- Implement DataWeave transformations for complex data mappings.
- Ensure integrations meet performance, security, and reliability standards.

2. EDI Integrations:
- Develop and manage EDI transactions such as 850 (Purchase Order), 855 (PO Acknowledgment), 856 (Advance Shipment Notice), 846 (Inventory Advice), and others.
- Design and implement EDI mapping and transformation processes within MuleSoft.
- Troubleshoot and resolve EDI-related errors and exceptions.

3. API Integrations:
- Develop RESTful and SOAP APIs for real-time data exchange between systems.
- Implement API policies (security, throttling, logging) within Anypoint Platform.
- Monitor API performance and ensure error handling and resiliency.

4. AS2 Connection Setup:
- Set up and manage AS2 connections for secure data exchange with trading partners.
- Configure certificates, endpoints, and MDN acknowledgments.
- Ensure AS2 compliance with industry standards and troubleshoot connectivity issues.

5. Monitoring and Support:
- Monitor integration pipelines using MuleSoft Anypoint Monitoring.
- Handle incident management, including root cause analysis and performance optimization.
- Maintain detailed documentation of integrations, APIs, and EDI workflows.
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About the Role

Role Purpose: Support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do

Oversee and support the process:
- Review daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it in a timely manner to TA & SES
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Hadoop.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Starburst Data Engineer/Architect

- Expertise in Starburst and policy management with Ranger or an equivalent.
- In-depth knowledge of data modelling principles and techniques, including relational and dimensional.
- Excellent problem-solving skills and the ability to troubleshoot and debug complex data-related issues.
- Strong awareness of data tools and platforms such as Starburst, Snowflake, and Databricks, and programming languages such as SQL.
- In-depth knowledge of data management principles, methodologies, and best practices, with excellent analytical, problem-solving and decision-making skills.
- Develop, implement and maintain database systems using SQL.
- Write complex SQL queries for integration with applications (see the sketch after this posting).
- Develop and maintain conceptual, physical and logical data models to meet organisational needs.

Do

1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators

2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys, and analyze survey data as per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Deliver
No. | Performance Parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Starburst.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
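The SQL-integration sketch referenced above: a minimal, hedged example of querying a Starburst/Trino cluster from an application using the open-source `trino` Python client. The host, catalog, schema, and table names are assumptions for illustration.

```python
# Illustrative Starburst/Trino query via the `trino` DB-API client.
# Connection details and table names are assumed, not real endpoints.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",  # assumed cluster hostname
    port=8080,
    user="analyst",
    catalog="hive",                # assumed catalog
    schema="sales",                # assumed schema
)
cur = conn.cursor()
cur.execute("""
    SELECT region, sum(amount) AS total
    FROM orders
    WHERE order_date >= DATE '2024-01-01'
    GROUP BY region
    ORDER BY total DESC
""")
for region, total in cur.fetchall():
    print(region, total)
```

In practice the same query can federate across catalogs (Hive, Iceberg, relational sources), which is the main reason an application layer fronts Starburst rather than each source directly.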
Posted 1 week ago
0.0 - 4.0 years
3 - 5 Lacs
Bengaluru
Work from Office
Role Purpose

The purpose of this role is to design, test and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of existing systems
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks and report them to concerned stakeholders

3. Maintain status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document necessary details and reports formally for proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
No. | Performance Parameter | Measure
1 | Continuous integration, deployment and monitoring of software | 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery, software management, troubleshooting of queries, customer experience, completion of assigned certifications for skill upgradation
3 | MIS & Reporting | 100% on-time MIS and report generation
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
About the Role

Role Purpose: Support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do

Oversee and support the process:
- Review daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it in a timely manner to TA & SES
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Hadoop.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
4.0 - 7.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About the Role

Do
- Research, design, develop, and modify computer vision and machine learning algorithms and models, leveraging experience with technologies such as Caffe, Torch, or TensorFlow.
- Shape product strategy for highly contextualized applied ML/AI solutions by engaging with customers, solution teams, discovery workshops and prototyping initiatives.
- Help build a high-impact ML/AI team by supporting recruitment, training and development of team members.
- Serve as an evangelist by engaging in the broader ML/AI community through research, speaking/teaching, formal collaborations and/or other channels.

Knowledge & Abilities
- Designing integrations of, and tuning, machine learning and computer vision algorithms
- Researching and prototyping techniques and algorithms for object detection and recognition
- Convolutional neural networks (CNNs) for image classification and object detection (a minimal example follows this posting)
- Familiarity with embedded vision processing systems
- Open-source tools and platforms
- Statistical modeling, data extraction, analysis
- Constructing, training, evaluating and tuning neural networks

Mandatory Skills:
- One or more of the following: Java, C++, Python
- Deep learning frameworks such as Caffe, Torch, or TensorFlow, and an image/video vision library such as OpenCV, Clarifai, or Google Cloud Vision
- Supervised and unsupervised learning
- Feature learning, text mining, and prediction models (e.g., deep learning, collaborative filtering, SVM, and random forest) on big data computation platforms (Hadoop, Spark, Hive, and Tableau)
- One or more of the following: Tableau, Hadoop, Spark, HBase, Kafka

Experience
- 2-5 years of work or educational experience in machine learning or artificial intelligence
- Creation and application of machine learning algorithms to a variety of real-world problems with large datasets
- Building scalable machine learning systems and data-driven products, working with cross-functional teams
- Working with cloud services such as AWS, Microsoft, IBM, and Google Cloud
- Working with one or more of the following: natural language processing, text understanding, classification, pattern recognition, recommendation systems, targeting systems, ranking systems or similar

Nice to Have
- Contributions to research communities and/or efforts, including publishing papers at conferences such as NIPS, ICML, ACL, CVPR, etc.

Education: BA/BS (advanced degree preferable) in Computer Science, Engineering or a related technical field, or equivalent practical experience.

Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions.
To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
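As a rough sketch of the kind of CNN image classifier this role works with, the following uses TensorFlow/Keras; the input size and class count are illustrative assumptions, not details from the posting:

```python
import tensorflow as tf

# Minimal CNN for image classification; shapes and class count are hypothetical.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 placeholder classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```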
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Job Requirement for Offshore Data Engineer (with ML expertise)
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ Years
Technical Skills & Expertise:
PySpark & Apache Spark: Extensive experience with PySpark and Spark for big data processing and transformation. Strong understanding of Spark architecture, optimization techniques, and performance tuning. Ability to run Spark jobs in distributed computing environments like Databricks.
Data Mining & Transformation: Hands-on experience in designing and implementing data mining workflows. Expertise in data transformation processes, including ETL (Extract, Transform, Load) pipelines. Experience in large-scale data ingestion, aggregation, and cleaning.
Programming Languages: Python & Scala: Proficient in Python for data engineering tasks, including libraries like Pandas and NumPy; Scala proficiency is preferred for Spark job development.
Big Data Concepts: In-depth knowledge of big data frameworks and paradigms, such as distributed file systems, parallel computing, and data partitioning.
Big Data Technologies: Cassandra & Hadoop: Experience with NoSQL databases like Cassandra and distributed storage systems like Hadoop.
Data Warehousing Tools: Proficiency with Hive for data warehousing solutions and querying.
ETL Tools: Experience with Apache Beam and other ETL tools for large-scale data workflows.
Cloud Technologies (GCP): Expertise in Google Cloud Platform (GCP), including core services like Cloud Storage, BigQuery, and Dataflow. Experience with Dataflow jobs for batch and stream processing. Familiarity with managing workflows using Airflow (Cloud Composer) for task scheduling and orchestration in GCP.
Machine Learning & AI:
GenAI Experience: Familiarity with Generative AI and its applications in ML pipelines.
ML Model Development: Knowledge of basic ML model building using tools like Pandas and NumPy, and visualization with Matplotlib.
MLOps Pipeline: Experience managing end-to-end MLOps pipelines for deploying models in production, particularly LLM (Large Language Model) deployments.
RAG Architecture: Understanding and experience in building pipelines using Retrieval-Augmented Generation (RAG) architecture to enhance model performance and output.
Tech stack: Spark, PySpark, Python, Scala, GCP Dataflow, Cloud Composer (Airflow), ETL, Databricks, Hadoop, Hive, GenAI, basic ML modeling, MLOps, LLM deployment, RAG
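For illustration only, a minimal PySpark batch job in the spirit of the pipelines described above might look like the following; the bucket paths, column names, and de-duplication key are hypothetical assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Ingest raw events (path and schema are placeholders).
raw = spark.read.json("gs://example-bucket/raw/events/")

# Clean and transform: de-duplicate, drop malformed rows, derive a date column.
clean = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Aggregate and write partitioned Parquet for downstream consumers.
(clean.groupBy("event_date").count()
      .write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("gs://example-bucket/curated/daily_counts/"))
```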
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Remote
Job Title: Senior Machine Learning Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ Years
Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment. Strong programming skills in Python and experience with ML frameworks. Proficiency in containerization (Docker) and orchestration (Kubernetes) technologies. Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI, GitHub Actions). Knowledge of data engineering concepts and experience building data pipelines. Strong understanding of compute, storage, and orchestration resources on cloud platforms. Deploying and managing ML models, especially on GCP services such as Cloud Run, Cloud Functions, and Vertex AI (though the role is cloud-platform agnostic). Implementing MLOps best practices, including model version tracking, governance, and monitoring for performance degradation and drift. Creating and using benchmarks, metrics, and monitoring to measure and improve services. Collaborating with data scientists and engineers to integrate ML workflows from onboarding to decommissioning. Experience with MLOps tools like Kubeflow, MLflow, and Data Version Control (DVC). Managing ML models on any of the following: AWS (SageMaker), Azure (Machine Learning), or GCP (Vertex AI).
Tech stack: AWS, GCP, or Azure experience (GCP preferred); PySpark required; Databricks is a plus; ML experience; Docker and Kubernetes.
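A minimal sketch of the model version tracking mentioned above, assuming MLflow and scikit-learn are available; the dataset, run name, and parameters are invented for illustration:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy data standing in for a real training set.
X, y = make_classification(n_samples=500, n_features=16, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Log the model as a versioned artifact for later deployment and monitoring.
    mlflow.sklearn.log_model(model, "model")
```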
Posted 1 week ago
2.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
About The Role
Job Title: AI Scientist
Position Overview: We are seeking a talented and experienced core AI algorithm developer to join our Lab45 AI Platform Team at Wipro. We are looking for candidates with 4 to 10 years of hands-on experience developing cutting-edge AI algorithms, in areas such as Generative AI, LLMs, deep learning, and unsupervised learning, along with expertise in Python, TensorFlow, PyTorch, PySpark, distributed computing, statistics, and cloud technologies. Candidates should have a strong foundation in AI and strong coding skills.
Key Responsibilities:
Develop and implement state-of-the-art AI algorithms and models to solve complex problems in diverse domains.
Collaborate with cross-functional teams to understand business requirements, translate them into scalable production-grade AI solutions, gather feedback, and iterate.
Work with large datasets to extract insights, optimize algorithms, and enhance model performance.
Contribute to the creation of intellectual property (IP) through patents, research papers, and innovative solutions.
Stay abreast of the latest advancements in AI research and technologies and apply them to enhance our AI offerings.
Qualifications:
Master's or Ph.D. degree (preferred) in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
4 to 10 years of proven experience in developing cutting-edge AI algorithms and solutions.
Strong proficiency in Python programming and familiarity with TensorFlow, PyTorch, PySpark, etc.
Experience with distributed computing and cloud platforms (e.g., Azure, AWS, GCP).
Demonstrated ability to work with large datasets and optimize algorithms for scalability and efficiency.
Excellent problem-solving skills and a strong understanding of AI concepts and techniques.
Proven track record of delivering high-quality, innovative solutions and contributing to IP creation (e.g., patents, research papers).
Strong communication and collaboration skills, with the ability to work effectively in a team environment.
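By way of example, here is a bare-bones PyTorch training loop of the sort such a role exercises daily; the toy tensors and network shape are assumptions, not anything specified in the posting:

```python
import torch
import torch.nn as nn

# Toy classifier and data; dimensions are placeholders.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)          # hypothetical features
y = torch.randint(0, 2, (64,))   # hypothetical labels

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # forward pass and loss
    loss.backward()              # backpropagation
    optimizer.step()             # parameter update
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```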
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Apps Support Sr Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires an in-depth understanding of how areas collectively integrate within the sub-function, as well as coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where developing an approach or taking action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills in order to filter, prioritize, and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes an informal/formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc., influencing decisions through advice, counsel, and/or facilitating services to others in the area of specialization. The work and performance of all teams in the area are directly affected by the performance of the individual.
6-8 years of strong application production support experience in the financial industry. Experience using call/ticketing software.
Hadoop/Big Data platform: Working knowledge of various components and technologies under the Cloudera distribution, such as HDFS, Hive, Impala, Spark, YARN, Sentry, Oozie, and Kafka. Very good knowledge of analyzing bottlenecks on the cluster: performance tuning, effective resource usage, capacity planning, and investigation. Perform daily performance monitoring of the cluster: implement best practices, ensure cluster stability, and create/analyze performance metrics. Hands-on experience supporting applications built on Hadoop.
Linux: 4-6 years of experience.
Database: Good SQL experience in any of the RDBMS.
Scheduler: Autosys, CONTROL-M, or other schedulers will be an added advantage.
Programming languages: UNIX shell scripting; Python/Perl will be an added advantage.
Other applications: Knowledge of, or working experience with, ITRS Active Console or other monitoring tools.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Support
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
3.0 - 5.0 years
15 - 22 Lacs
Bengaluru
Remote
Role & responsibilities:
Design real-time data pipelines for structured and unstructured sources.
Collaborate with analysts and data scientists to create impactful data solutions.
Continuously improve data infrastructure based on team feedback.
Take full ownership of complex data problems and iterate quickly.
Promote strong documentation and engineering best practices.
Monitor, detect, and fix data quality issues with custom tools.
Preferred candidate profile:
Experience with big data tools like Spark, Hadoop, Hive, and Kafka.
Proficient in SQL and working with relational databases.
Hands-on experience with cloud platforms (AWS, GCP, or Azure).
Familiar with workflow tools like Airflow (see the sketch below).
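To make the orchestration point concrete, here is a minimal Airflow DAG sketch (assuming Airflow 2.x; the task names and daily schedule are illustrative assumptions):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull from sources")        # placeholder for real ingestion logic

def validate():
    print("run data-quality checks")  # placeholder for custom quality tooling

def publish():
    print("write curated tables")     # placeholder for publishing logic

with DAG(dag_id="example_pipeline",
         start_date=datetime(2024, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_publish = PythonOperator(task_id="publish", python_callable=publish)
    t_ingest >> t_validate >> t_publish  # linear dependency chain
```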
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Team Summary: The Risk and Identity Solutions (RaIS) team provides risk management services for banks, merchants, and other payment networks. Machine learning and AI models are at the heart of the real-time insights our clients use to manage risk. These models are created by the Visa Predictive Models (VPM) team, and their continual improvement and efficient deployment are essential to our future success. To support our rapidly growing suite of predictive models, we are looking for engineers who are passionate about managing large volumes of data, creating efficient, automated processes, and standardizing ML/AI tools.
Primary responsibilities
Possess a strong understanding of data interpretation and the ability to represent data effectively, using appropriate visualization techniques to deliver actionable insights.
Focus on the user experience to design interactive prototypes, with a strong understanding of business context and data, following industry and Visa best practices.
Collect, analyze, transform, and interpret raw data from various sources.
Design and develop BI solutions, data models, and KPI measures to solve business problems.
Create visualizations that are user-friendly, intuitive, and tailored to the needs of the end user, ensuring that the visual elements effectively convey the intended message.
Develop and maintain interactive dashboards and reports using BI tools such as Power BI, applying visual elements like charts, graphs, and maps and sound visual design principles.
Ensure dashboards and reports function correctly, meet user requirements, and provide accurate, up-to-date insights; perform bug triage by systematically testing data visualizations for accuracy and functionality, identifying issues, prioritizing their resolution based on severity and impact, and ensuring all bugs are fixed in a timely manner.
Optimize dashboard performance by enhancing data processing speeds, improving query performance, and refining data models to ensure efficient, reliable, and timely data retrieval and analysis for business intelligence applications.
Ensure the security of data and BI solutions, implementing data security measures and complying with all relevant regulations and best practices.
Set up and maintain the data visualization platform, manage access controls, and ensure the system's overall health and performance using usage reports.
Document all processes, methodologies, and instructions related to the BI solutions, create comprehensive and accessible documentation, conduct end-user training sessions, and ensure all documentation is consistently updated and available to relevant stakeholders.
Technical skills (must have)
Expertise in LOD (Level of Detail) expressions, DAX (Data Analysis Expressions), Power Query, the M language, and Tableau Prep to create measures and transform data.
Proficiency in data visualization tools such as Power BI.
Advanced proficiency in SQL, including a deep understanding of queries, joins, stored procedures, triggers, and views, as well as experience optimizing SQL for improved performance and efficiency. Comfortable creating and maintaining database schemas and indexes, and writing complex SQL scripts for data analysis and extraction.
Experience interacting with data warehouses and data lakes, using tools like PySpark, Apache Hadoop, Amazon Redshift, Snowflake, and Amazon S3 to ingest and extract data for insights.
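One performance pattern implied above is pushing aggregation into the warehouse rather than pulling raw rows into the BI tool. A self-contained sketch follows; SQLite stands in for the warehouse connection, and the table and columns are invented:

```python
import sqlite3

# In-memory database standing in for a real warehouse connection.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('APAC', 120.0), ('APAC', 80.0), ('EMEA', 300.0);
""")

# Aggregate server-side so the dashboard receives a small, ready-to-plot result.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total, COUNT(*) AS orders
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('EMEA', 300.0, 1), ('APAC', 200.0, 2)]
```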
Non-technical skills
Experienced in working closely with cross-functional teams and stakeholders to ensure understanding and usability of data visualizations.
Stays continually updated with the latest trends and advancements in data visualization techniques and tools.
Excellent problem-solving skills and strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Excellent communication and interpersonal skills for managing relationships with stakeholders, and strong presentation skills to communicate data insights and visualizations effectively to diverse audiences, tailoring the presentation to the audience's level of expertise.
Ability to plan, prioritize, and manage time effectively, keep track of tasks and deadlines, maintain a tidy and systematic work environment, and coordinate resources to achieve goals in a timely and efficient manner.
Takes full responsibility for BI projects, ensuring accurate and timely delivery of insights and addressing any issues or inaccuracies in the data promptly and effectively.
Qualifications
5+ years of relevant work experience with a bachelor's degree, or at least 2 years of work experience with an advanced degree (e.g., Master's, MBA, JD, MD), or 0 years of work experience with a PhD, OR 8+ years of relevant work experience.
Bachelor's degree in computer science, engineering, or a related field.
Proven experience as a BI engineer / visualization developer or in a similar role for 6+ years.
Posted 1 week ago
1.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
Create test case documents/plans for testing the data pipelines. Check the mappings for the fields that support data staging and the data marts, and the data type constraints of the fields present in Snowflake. Verify non-null fields are populated. Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process. Verify stored procedure calculations and data mappings. Verify data transformations are correct based on the business rules. Verify successful execution of data loading workflows.
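A minimal sketch of such checks, assuming the staged rows have already been pulled into a pandas DataFrame (for example via the Snowflake Python connector); the column names and the business rule are hypothetical:

```python
import pandas as pd

def check_non_null(df: pd.DataFrame, required=("customer_id", "order_ts")):
    # Verify non-null fields are populated.
    for col in required:
        assert df[col].notna().all(), f"NULLs found in required field {col}"

def check_net_amount_rule(df: pd.DataFrame):
    # Hypothetical transformation rule: net_amount = gross_amount - discount.
    expected = df["gross_amount"] - df["discount"]
    assert df["net_amount"].equals(expected), "net_amount transformation is wrong"

# Tiny stand-in for rows fetched from the staging layer.
df = pd.DataFrame({"customer_id": [1, 2],
                   "order_ts": ["2024-01-01", "2024-01-02"],
                   "gross_amount": [100.0, 50.0],
                   "discount": [10.0, 5.0],
                   "net_amount": [90.0, 45.0]})
check_non_null(df)
check_net_amount_rule(df)
print("all checks passed")
```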
Posted 1 week ago
7.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are seeking a highly motivated and technically proficient Big Data Engineer to join our innovative team and contribute to the development of a next-generation Big Data platform. This is an exciting opportunity to work on cutting-edge solutions handling petabyte-scale datasets for one of the world's largest technology companies, headquartered in Silicon Valley.
Key Responsibilities
Design and develop scalable Big Data analytical applications.
Build and optimize complex ETL pipelines and data processing frameworks.
Implement large-scale, near real-time streaming data processing systems.
Continuously enhance and support the project's codebase, CI/CD pipelines, and deployment infrastructure.
Collaborate with a team of top-tier engineers to build highly performant and resilient data systems using the latest Big Data technologies.
Required Qualifications
Strong programming skills in Scala, Java, or Python (Scala preferred).
Hands-on experience with Apache Spark, Hadoop, and Hive.
Proficiency in stream processing technologies such as Kafka, Spark Streaming, or Akka Streams.
Solid understanding of data quality, validation, and quality engineering practices.
Experience with Git and version control best practices.
Ability to rapidly learn and apply new tools, frameworks, and technologies.
Preferred Qualifications
Strong experience with AWS Cloud services (e.g., EMR, S3, Lambda, Glue, Redshift).
Familiarity with Unix-based operating systems and shell scripting (bash, ssh, grep, etc.).
Experience with GitHub-based development workflows and pull request processes.
Knowledge of JVM-based build tools such as SBT, Maven, or Gradle.
What We Offer
Opportunity to work on bleeding-edge Big Data projects with global impact.
A collaborative and intellectually stimulating environment.
Competitive compensation and performance-based incentives.
Flexible work schedule.
Comprehensive benefits package including medical insurance and fitness programs.
Regular corporate social events and team-building activities.
Ongoing professional growth and career development opportunities.
Access to a modern and well-equipped office space.
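For flavor, here is a minimal Spark Structured Streaming job reading from Kafka, in the spirit of the near real-time systems described above; the broker address, topic, and window size are illustrative assumptions, and the spark-sql-kafka connector must be on the classpath:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

# Read a Kafka topic as an unbounded stream (placeholder broker and topic).
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(value AS STRING) AS body", "timestamp"))

# Count events per one-minute window and print to the console sink.
counts = events.groupBy(F.window("timestamp", "1 minute")).count()
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```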
Posted 1 week ago
5.0 - 8.0 years
8 - 15 Lacs
Hyderabad
Work from Office
1. Adobe Experience Platform (AEP) Expertise: Understanding of AEP's architecture, data modeling, and functionalities like Real-Time Customer Profile, Data Lake, and Customer Journey Analytics.
2. Adobe Experience Cloud (AEC) Products: Hands-on experience with Adobe tools like Adobe Customer Journey Analytics (CJA), Adobe Experience Manager (AEM), Adobe Target, and Adobe Real-Time CDP.
3. Data Ingestion & Transformation: Experience with ETL (Extract, Transform, Load) processes, data schemas (XDM - Experience Data Model), and integration of multiple data sources into AEP.
4. Query and Data Management: Strong skills in SQL, NoSQL, and Adobe Query Service to process and analyze customer data.
5. Identity Resolution & Identity Graph: Knowledge of identity stitching and how AEP manages customer identities across different data sources.
6. Tag Management & SDKs: Experience with Adobe Launch (Tags) and the Adobe Mobile SDK for data collection and event tracking.
7. Streaming & Batch Data Processing: Ability to work with APIs, event-driven architectures, and batch data ingestion into AEP.
8. Cloud & Big Data Technologies: Experience with AWS, Azure, Google Cloud, Kafka, Spark, or Hadoop is an added advantage.
9. Scripting & Development: Proficiency in JavaScript, Python, or Java for data transformations and API integrations.
10. Adobe Experience Platform APIs: Hands-on experience with AEP APIs to automate and extend platform capabilities.
Also expected: optimizing, automating, and scaling AEP implementations; CI/CD and DevOps; API development; cloud platforms; data governance and compliance; identity and access management (IAM).
Requirements:
Collecting and integrating data from various sources.
Centralizing and standardizing customer data for consistency.
Segmenting customers to target specific groups effectively.
Designing and optimizing customer journeys to enhance engagement.
Analyzing data to gain insights and improve marketing strategies.
Delivering personalized experiences using data science and machine learning.
Ensuring compliance and security of customer data.
Posted 1 week ago
3.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark
Requirements: Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, and Kafka. Experience in cloud computing, e.g., AWS, GCP, Azure. Able to quickly pick up new programming languages, technologies, and frameworks. Strong skills in building positive relationships across Product and Engineering. Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders. Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc. Working knowledge of data warehousing, data modelling, governance, and data architecture. Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components). Experience working in Agile and Scrum development processes. Experience with EMR/EC2, Databricks, etc. Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake. Experience architecting data products in streaming, serverless, and microservices architectures and platforms.
Posted 1 week ago
4.0 - 9.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Senior Engineer - Cloud Services and Software
1. 4+ years of strong experience in Microsoft .NET along with AWS cloud experience.
2. At least 1+ year in .NET Core.
3. Strong experience in SQL (MySQL, SQL Server) and NoSQL databases.
4. Good to have: knowledge of design patterns, SOLID principles, and Clean Architecture.
5. Good to have: experience in application migration from .NET Framework to .NET Core.
6. Good to have: experience with CI/CD tooling such as SourceTree.
7. Good to have: experience with Jira/Agile.
8. Should be ready to learn new technologies.
9. Strong analytical and problem-solving skills.
10. Good communication and client-handling skills.
Posted 1 week ago
The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.
Major Indian IT hubs are known for their thriving technology industry and have a high demand for Hadoop professionals.
The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.
In the Hadoop field, a typical career path may progress from Junior Developer to Senior Developer and Tech Lead, eventually leading to roles like Data Architect or Big Data Engineer.
In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
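Interviews in this space often start with the ecosystem's hello-world. For example, a word count in PySpark (the input path is a placeholder) exercises Spark's core transformations:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("wordcount").getOrCreate()

# Classic word count (placeholder input path).
lines = spark.read.text("hdfs:///tmp/input.txt")
words = lines.select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
counts = words.filter(F.col("word") != "").groupBy("word").count()
counts.orderBy(F.desc("count")).show(10)  # top 10 most frequent words
```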
As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!