6.0 years
2 - 8 Lacs
Hyderābād
Remote
Customer Experience Engineer 2 Hyderabad, Telangana, India Date posted Jul 29, 2025 Job number 1850854 Work site Up to 50% work from home Travel 0-25% Role type Individual Contributor Profession Program Management Discipline Customer Experience Engineering Employment type Full-Time Overview We are the ACES Strategic team (Advanced Cloud Engineering & Supportability), a global engineering team in Azure CXP focused on strategic Azure customers. We are customer-obsessed problem-solvers. We orchestrate and drive deep engagements in areas like Incident Management, Problem Management, Support, Resiliency, and empowering customers. We represent the customer and amplify the customer's voice with Azure Engineering, connecting to the quality vision for Azure. We innovate and find ways to scale our learning across our customer base. Diversity and inclusion are central to who we are, how we work, and what we enable our customers to achieve. We know that empowering our customers starts with empowering our team to show up authentically, work in ways that are best for them, and achieve their career goals. Every minute of every day, customers stake their entire business and reputation on the Microsoft Cloud. The Azure Customer Experience (CXP) team believes that when we meet our high standards for quality and reliability, our customers win. If we falter, our customers fail their end-customers. Our vision is to turn Microsoft Cloud customers into fans. Are you constantly customer-obsessed and passionate about solving complex technical problems? Do you take pride in enhancing customer experience through innovation? If the answer is yes, then join us and surround yourself with people who are passionate about cloud computing and believe that extraordinary support is critical to customer success. As a customer-focused Advanced Cloud Engineer, you are the primary engineering contact accountable for your customer's support experience on Azure. You will drive resolution of critical and complex problems, support key customer projects on Azure, and be the voice of the customer within Azure. In this role, you will work in partnership with Customer Success Account Managers, Cloud Solution Architects, Technical Support Engineers, and Azure engineering, with our mission to turn Azure customers into fans through a world-class, engineering-led support experience. This role is flexible in that you can work up to 50% from home. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications Required Qualifications: Bachelor's degree in Engineering, Computer Science, or a related field AND 6+ years of software industry experience related to technology, OR equivalent experience. 4 years of demonstrated IT experience supporting and troubleshooting enterprise-level, mission-critical applications, resolving complex issues/situations and driving technical resolution across cross-functional organizations. 2+ years of experience in an external customer/client-facing role. 2+ years of experience working on cloud computing technologies. Experience with being on-call. Technical Skills: Cloud computing technologies.
Demonstrated hands-on experience in one or more of the following: Core IaaS: Compute, Storage, Networking, High Availability. Data Platform and Big Data: SQL Server, Azure SQL DB, HDInsight/Hadoop, Machine Learning, Azure Stream Analytics, Azure Data Factory/Databricks. Azure PaaS Services: Redis Cache, Service Bus, Event Hub, Cloud Service, IoT suite, Mobile Apps, etc. Experience in monitoring-related technologies like Azure Monitor, Log Analytics, Resource Graph, Azure Alerts, Network Watcher, Grafana, Ambari, Prometheus, Datadog, Confluent, etc. (a brief query sketch follows this posting). Experience in deploying, configuring, and operating enterprise monitoring solutions. Experience in one or more automation languages (PowerShell, Python, C#, Open Source). Communication skills: ability to empathize with customers and convey confidence. Able to explain highly technical issues to varied audiences. Able to prioritize and advocate for customers' needs to the proper channels. Take ownership and work towards a resolution. Customer Obsession: Passion for customers and focus on delivering the right customer experience. Growth Mindset: Openness and ability to learn new skills and technologies in a fast-paced environment. The ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Responsibilities Technically Oriented With minimal oversight, track customer incidents, engage with strategic customers and partners to understand issues, contribute to troubleshooting through diagnostics, communicate progress and next steps to customers with a focus on reducing time taken to mitigate critical incidents. Use engineering and support tools, customer telemetry and/or direct customer input to detect and flag issues in the products or with the customer usage of the products. Help customers stay current with best practices by sharing content. Identify and leverage developmental opportunities across product areas and business processes (e.g., mentorships, shadowing, trainings) for professional growth and to develop technical skills to resolve customer issues. Customer Solution Lifecycle Management With minimal guidance, serve as a connecting point between the product team and customers throughout the engagement life cycle, engage with customers to understand their business and availability needs, develop and offer proactive guidance on designing configurations and deploying solutions on Azure with support from subject matter experts. Handle critical escalations on customer issues from the customer or support or field teams, conduct impact analysis, help customers with answers to their technical questions, and serve as an escalation resource in areas of subject matter expertise. Conduct in-depth root cause analysis of issues, translate findings into opportunities for improvement, and track and drive them as repair items. Relationship/Experience Management Act as the voice of customers and channel product feedback from strategic customers to product groups. Identify customer usage patterns and drive resolutions on recurring issues with product groups. Close the feedback loop with the customers on product features.
With minimal guidance, partner with other teams (e.g., program managers, software engineers, product, customer service support teams) to prioritize, unblock, and resolve critical customer issues. Collaborate with stakeholders to support delivery of solutions to strategic customers and resolve customer issues. Embody our culture and values. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. Industry-leading healthcare Educational resources Discounts on products and services Savings and investments Maternity and paternity leave Generous time away Giving programs Opportunities to network and connect Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
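To make the monitoring qualification above concrete: the Azure Monitor/Log Analytics stack named in the posting can be queried from the automation languages it lists. Below is a minimal, hypothetical Python sketch using the azure-identity and azure-monitor-query packages; the workspace ID and KQL query are placeholders, not anything specified by Microsoft.

```python
# Hypothetical sketch: pull an hourly activity count from a Log Analytics
# workspace. Requires `pip install azure-identity azure-monitor-query`.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

response = client.query_workspace(
    workspace_id="<workspace-guid>",  # placeholder workspace ID
    query="AzureActivity | summarize Count=count() by bin(TimeGenerated, 1h)",
    timespan=timedelta(days=1),
)

# Iterating `tables` assumes the query succeeded fully; partial results
# surface through a separate partial-result type.
for table in response.tables:
    for row in table.rows:
        print(row)
```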
Posted 6 days ago
5.0 years
8 - 10 Lacs
Gurgaon
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? In this role, the person will report to the Product Manager – Travel & Lifestyle Services. This is an exciting opportunity for a PO/Analyst to work on data-related products and maintain the quality of TLS data in the Big Data Platform, Cornerstone. Minimum Qualifications 5+ years' experience in the travel domain, or at minimum a background in the financial domain At least 5 years of experience in technology product management or data-related products. At least 5 years of experience in Software Architecture and Software Development. 3 years' experience with SQL Experience with agile methodologies and tools, e.g., Rally. An ability to solve complex problems and a highly analytical approach. Demonstrated ability to learn, with the curiosity to understand and master the travel domain. Excitement and passion for the travel domain. Self-starter with the ability to think creatively and strategically Strong communication and stakeholder management skills Excellent communication skills with the ability to engage, influence, and inspire partners to drive collaboration and alignment. Demonstrated ability to maintain a positive attitude and sense of humor in the face of chaos and challenges Has a successful record of leading and coordinating business, delivery, and technology teams to define, prioritize, and deliver on a product roadmap Strong product management skills, with full ownership from analysis through implementation. High degree of organization, individual initiative, and personal accountability. Platform Knowledge Experience working with Hadoop and the Big Data Platform – Cornerstone, Google Cloud Platform (GCP) Proficient in the Microsoft Suite, Power BI, Tableau, and SQL Education Bachelor's degree in a related field (Computer Science, Information Technology, Engineering, Electronics) Preferred Qualifications Master's degree in a related field (Computer Science, Information Technology, Engineering, Electronics) We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 6 days ago
1.0 years
0 Lacs
Gurgaon
On-site
About NCR Atleos NCR Atleos, headquartered in Atlanta, is a leader in expanding financial access. Our dedicated 20,000 employees optimize the branch, improve operational efficiency, and maximize self-service availability for financial institutions and retailers across the globe. We are on the lookout for a data scientist who is passionate about data and problem-solving. The ideal candidate is eager to learn, works collaboratively with various teams, and is committed to staying updated with the latest developments in data science and machine learning. We value individuals who are proactive, detail-oriented, and ready to contribute their expertise to our diverse analytical projects. If you are someone who thrives in a challenging environment and is looking for a role where you can make a significant impact, we would love to hear from you. To be successful in this position, a candidate's specific skills include, but are not limited to: Partnership: Work closely with colleagues from Sales, Operations, Product, Finance, and others to understand their domains and processes, and come up with solutions to their problems and tools to make their day-to-day operations efficient and effective Strategic and Analytical Orientation: Experienced in decision making and problem solving based on analytics. Conceptual thinking for framework creation combined with strong quantitative orientation to solve complex problems with rigorous analytics and monitoring Strong Technical/Programming Skills: Orientation to and ability to code in languages such as SQL and Python, integrate structured and unstructured internal and external data sources to create user interfaces, adept at building visualizations using UI tools Machine Learning/Statistical Modeling: Training and hands-on experience either through coursework and/or professional experience. Strong Communication Skills: Strong written and oral communication skills coupled with skills to influence and drive agreement through intellectual, interpersonal and negotiation skills Execution Focus: Build and manage execution plans of business intent and requirements, execute against the strategy and monitor results Results Focus: Focused on achieving short- and long-term goals. Able to drive and execute an agenda in an uncertain, fluid environment with minimal supervision Strong business judgment, leadership, and integrity: Tenacious decision maker able to bring a healthy, aggressive yet responsible approach to business Product Maker: The right candidate will be self-motivated and have a "Product Maker" mindset. Strong conceptual thinking to understand the business and ability to grasp analytical and technical concepts What to Expect? Developing processes and tools to monitor and analyze model performance and data accuracy Leading ongoing reviews of business processes and developing optimization strategies Conducting advanced statistical analysis to provide actionable insights, identify trends, and measure performance Collaborating with stakeholders and teams to implement and evaluate data science models and outcomes Working with data engineering teams to validate the data ETL process as the data environment transitions to the cloud. Proposing solutions and strategies to business challenges Presenting information using data visualization techniques Minimum Qualifications: Bachelor's degree in a technical discipline with 1+ years of experience, or an advanced degree with a commensurate level of work experience Expertise in multiple programming languages including Python, R, and SQL.
Deep understanding of machine learning algorithms and principles. Experience with cloud platforms like AWS or Google Cloud. Proficiency in big data frameworks like Hadoop or Spark. Ability to perform complex data analysis and build predictive models. Strong skills in data visualization and presentation. Offers of employment are conditional upon passage of screening criteria applicable to the job. EEO Statement NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law. Statement to Third Party Agencies To ALL recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.
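To illustrate the "build predictive models" expectation above, here is a minimal scikit-learn sketch of a baseline classification pipeline. The CSV path, feature set, and the "failed_within_30d" label column are hypothetical placeholders, not NCR Atleos data.

```python
# Minimal predictive-modelling sketch: a baseline classification pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("atm_telemetry.csv")  # hypothetical extract
X = df.drop(columns=["failed_within_30d"])  # assumed label column
y = df["failed_within_30d"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale numeric features, then fit a simple, interpretable baseline model.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```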
Posted 6 days ago
1.0 years
3 - 4 Lacs
Delhi
On-site
Madrid Software is hiring.
Company Description: Madrid Software is a leading Ed-tech company in India, providing online and classroom courses in Business Analytics, Data Science with R and Python, Data Analytics with R and Python, Big Data Hadoop, Machine Learning, Artificial Intelligence, Digital Marketing, Software Testing, UI/UX Design, and Amazon Web Services (AWS). The institute was established in May 2011 by ex-Cognizant employees and has successfully trained over 20,000 IT professionals.
Position: Senior Admission Counsellor
Location: Saket (onsite)
Working Days: 6 days working
Timing: 10:00 a.m.–6:30 p.m.
Job Overview: This is a full-time on-site role as a Senior Admissions Counsellor located in Saket. The Admissions Counsellor will be responsible for day-to-day tasks such as communicating with potential students, providing customer service, selling courses, and educating students about the institute and its courses.
Key Responsibilities:
- Counsel students and working professionals.
- Responsible for converting queries into admissions.
- Ability to work independently and as part of a team.
- Proficiency in CRM – LeadSquared & Zoho.
- Familiarity with the Ed-tech industry and its trends is preferred.
- Efficient follow-up with customers for closure and admission purposes.
- Lead management through telephony engagements/calls; managing, tracking, and timely follow-up on leads in the CRM.
Requirements:
- Should have 1+ year of experience in admission counselling.
- Ability to convert leads into sales.
- Good knowledge of technical courses (Data Science & Data Analytics).
- Excellent interpersonal and communication skills.
- 6 days working.
How to Apply: Interested candidates can share their updated CV at madridplacement@gmail.com or call 9871874180. Job Type: Full-time Pay: ₹25,000.00 - ₹40,000.00 per month Benefits: Cell phone reimbursement Schedule: Day shift Language: English (Preferred) Work Location: In person
Posted 6 days ago
4.0 - 5.0 years
5 - 19 Lacs
HSR Layout, Bengaluru, Karnataka
On-site
Data Engineering / Tech Lead – Experience: 4+ years About Company InspironLabs is a GenAI-driven software services company focused on building AI-powered, scalable digital solutions. Our skilled team delivers intelligent applications tailored to specific business challenges, using AI and Generative AI (GenAI) to accelerate innovation. Key strengths include: AI & GenAI Focus – Harnessing AI and Generative AI to deliver smarter solutions. Scalable Tech Stack – Building future-ready systems for performance and resilience. Proven Enterprise Experience – Deploying solutions across industries and geographies. To know more, visit: www.inspironlabs.com Key Responsibilities • Design, implement, and maintain robust data pipelines. • Collaborate with data scientists and analysts for integrated solutions. • Mentor junior engineers and manage project timelines. Required Skills • Experience with Spark, Hadoop, Kafka. • Expertise in SQL, Python, cloud data platforms (AWS/GCP/Azure). • Hands-on with orchestration tools like Airflow, DBT. Qualifications Experience: 4 to 5 years in data engineering roles. Bachelor’s in Computer Science, Engineering, or related field. Place of Work In Office – Bangalore Job Type Full Time Job Type: Full-time Pay: ₹560,716.53 - ₹1,944,670.55 per year Benefits: Flexible schedule Health insurance Paid sick time Paid time off Provident Fund Ability to commute/relocate: HSR Layout, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Please mention your notice period? What is your current CTC? Work Location: In person
Posted 6 days ago
5.0 years
12 - 18 Lacs
Mohali
On-site
Technical Support Engineer Job Summary The Technical Support Engineer (TSE) acts as an SME for a book of enterprise accounts. The TSE is responsible for answering all technical questions within both standard and custom deployment environments and assisting with supported LTS upgrades. The TSE is also responsible for peer training and development, personal continued education, and contributing to our reference documentation. They will coordinate closely with Support leadership, Engineering, Product and Accounts teams to ensure our customers receive a value driven enterprise experience. A TSE is able to work independently, with minimal guidance, and demonstrates an expert degree of proficiency in both SEP and Galaxy. Responsibilities ● Technical Support: ○ Provide support for standard and custom deployments ○ Answer break/fix and non-break/fix technical questions through SFDC ticketing system ○ Efficiently reproduce reported issues by leveraging tools (minikube, minitrino, docker-compose, etc.), identify root causes, and provide solutions ○ Open SEP and Galaxy bug reports in Jira and feature requests in Aha! ● LTS Upgrades: ○ Provide upgrade support upon customer request ■ Customer must be on a supported LTS version at the time of request ■ TSE must communicate unsupported LTS requests to the Account team as these require PS services ● Monthly Technical check-ins ○ Conduct regularly scheduled technical check-ins with each BU ■ Discuss open support tickets, provide updates on product bugs and provide best practice recommendations based on your observations and ticket trends ■ Responsible for ensuring customer environments are on supported LTS versions ● Knowledge Sharing/Technical Enablement: Knowledge exchange and continued technical enablement are crucial for the development of our team and the customer experience. It's essential that we keep our product expertise and documentation current and that all team members have access to information. ○ Contribute to our reference documentation ○ Lead peer training ○ Consultant to our content teams ○ Own your personal technical education journey ● Project Involvement ○ Contribute to or drive components of departmental and cross functional initiatives ● Partner with Leadership ○ Identify areas of opportunity with potential solutions for inefficiencies or obstacles within the team and cross-functionally ○ Provide feedback to your manager on continued ed. opportunities, project ideas, etc. Requirements ● 5+ years of support experience ● 3+ years of Big Data, Docker, Kubernetes and cloud technologies experience Skills ● Big Data (Teradata, Hadoop, Data Lakes, Spark) ● Docker and Kubernetes ● Cloud technologies (AWS, Azure, GCP) ● Security - Authentication (LDAP, OAuth2.0) and Authorization technologies ● SSL/TLS ● Linux Skills ● DBMS Concepts/SQL Exposure Languages: SQL, Java, Python, Bash Job Type: Full-time Pay: ₹1,200,000.00 - ₹1,800,000.00 per year Benefits: Cell phone reimbursement Health insurance Internet reimbursement Paid sick time Provident Fund Work Location: In person
Posted 6 days ago
0 years
0 Lacs
India
Remote
Role: NiFi Developer
Notice Period: candidates serving notice or immediate joiners preferred
Client: Marriott
Payroll: Dminds
Work Mode: Remote
Interview Mode: Virtual
We're looking for someone who has built, deployed, and maintained NiFi clusters.
Roles & Responsibilities:
- Implement solutions utilizing advanced AWS components (EMR, EC2, etc.) integrated with Big Data/Hadoop distribution frameworks (ZooKeeper, YARN, Spark, Scala, NiFi, etc.).
- Design and implement Spark jobs to be deployed and run on existing active clusters.
- Configure Postgres databases on EC2 instances, keep the application up and running, and troubleshoot issues to meet the desired application state.
- Create and configure secure VPCs, subnets, and security groups across private and public networks.
- Create alarms, alerts, and notifications for Spark jobs that send job status to email and Slack group messages, with logs in CloudWatch.
- Build NiFi data pipelines to process large data sets, and configure lookups for data validation and integrity.
- Generate large test data sets with data integrity using Java, for use in the development and QA phases.
- Improve the performance and optimization of existing Spark Scala applications running on EMR clusters.
- Build Spark jobs to convert CSV data to custom HL7/FHIR objects using FHIR APIs.
- Deploy SNS, SQS, Lambda functions, IAM roles, custom policies, and EMR with Spark and Hadoop setup, along with bootstrap scripts for additional software, in QA and production environments using Terraform scripts.
- Build Spark jobs to perform change data capture (CDC) on Postgres tables and update target tables using JDBC properties (a minimal sketch of this pattern follows below).
- Integrate a Kafka publisher in Spark jobs to capture errors from the Spark application and push them into a Postgres table.
- Build NiFi data pipelines in a Docker container environment during the development phase.
- Work with the DevOps team to clusterize the NiFi pipeline on EC2 nodes, integrated with Spark, Kafka, and Postgres running on other instances using SSL handshakes, in QA and production environments.
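As a rough illustration of the CDC-over-JDBC pattern referenced above, here is a minimal PySpark sketch. The connection details, table names, and the use of an "updated_at" watermark column are assumptions for illustration only, not Marriott's actual setup.

```python
# Minimal PySpark sketch of a JDBC-based incremental pull from Postgres.
# All connection details and table/column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("postgres-cdc-sketch")
    # The Postgres JDBC driver must be available on the classpath.
    .config("spark.jars.packages", "org.postgresql:postgresql:42.7.3")
    .getOrCreate()
)

jdbc_url = "jdbc:postgresql://db-host:5432/appdb"  # hypothetical host/db
props = {"user": "etl_user", "password": "***", "driver": "org.postgresql.Driver"}

# Incremental pull: fetch only rows changed since the last watermark.
# An assumed "updated_at" audit column stands in for a real CDC feed.
changes = (
    spark.read.jdbc(jdbc_url, "public.orders", properties=props)
    .where(F.col("updated_at") > F.lit("2025-07-01 00:00:00"))
)

# Append the captured changes to a target history table.
changes.write.jdbc(jdbc_url, "public.orders_history", mode="append", properties=props)
```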
Posted 6 days ago
6.0 years
15 - 18 Lacs
Indore
On-site
Location: Indore Experience: 6+ Years Work Type: Hybrid Notice Period: 0-30 days We are hiring for a Digital Transformation Consulting firm that specializes in the advisory and implementation of AI, Automation, and Analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP. Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms. Key Responsibilities: Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions (a minimal pipeline sketch follows below). Define best practices for data architecture, modeling, and governance. Oversee data integration, transformation, and migration strategies. Ensure high availability, performance tuning, and optimization of databases and ETL pipelines. Implement data security, compliance, and backup strategies. Required Skills & Qualifications: 6+ years of experience in database and data engineering roles. Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS). Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica). Experience with cloud data platforms (AWS, Azure, GCP). Proficiency in programming/scripting languages (Python, SQL, shell scripting). Strong problem-solving, leadership, and communication skills. Preferred Skills (Good to Have): Experience with big data technologies (Hadoop, Spark, Kafka). Knowledge of real-time data processing. Exposure to AI/ML technologies and working with ML algorithms. Job Types: Full-time, Permanent Pay: ₹1,500,000.00 - ₹1,800,000.00 per year Schedule: Day shift Application Question(s): We must fill this position urgently. Can you start immediately? Have you held a lead role in the past? Experience: Extract, Transform, Load (ETL): 6 years (Required) Python: 5 years (Required) Big data technologies (Hadoop, Spark, Kafka): 6 years (Required) Snowflake: 6 years (Required) Data warehouse: 6 years (Required) Location: Indore, Madhya Pradesh (Required) Work Location: In person
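To make the ETL-framework requirement concrete, here is a minimal Apache Airflow DAG sketch (assuming Airflow 2.x); the task bodies and identifiers are hypothetical placeholders, not the client's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> transform -> load flow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source systems")


def transform():
    print("clean and reshape the extracted data")


def load():
    print("write the result to the warehouse")


with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # use `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```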
Posted 6 days ago
1.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Decision Science Junior Analyst. Principal Responsibilities To support the Business by providing vital input for strategic planning by senior management, enabling effective decision making and addressing unforeseen challenges. The team leverages the best of data and analytics capabilities to enable smarter decisions and drive profitable growth. The team supports various domains ranging from Regulatory, Operations, Procurement, and Human Resources to Financial Crime Risk. It provides support to various business groups, and the job involves data analysis, model and strategy development and implementation, business intelligence, reporting, and data management. The team addresses a range of business problems covering business growth, improving customer experience, limiting risk exposure, capital quantification, enhancing internal business processes, etc. Proactively identify key emerging compliance risks across all RC categories and interface appropriately with other RC teams and senior management. Provide a greater understanding of the potential impact and associated consequences/failings of significant new or emerging risks, and provide innovative and effective solutions based on SME knowledge that assist the Business/Function. Propose, manage, and track the resolution of subsequent risk management actions. Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities. Against this period of considerable regulatory change and development, and as regulators develop their own understanding of compliance risk management, the role holder must maintain a strong knowledge and understanding of regulatory developments and the evolution of the compliance risk framework, risk appetite, and risk assessment methodology. Deliver repeatable and scalable analytics through the semi-automation of L1 Financial Crime Risk and Regulatory Compliance Risk Assurance controls testing. Here, Compliance Assurance will develop and run analytics on data sets which will contain personal information such as customer and employee data. Requirements Bachelor's degree from a reputed university in statistics, economics, or another quantitative field. Freshers with an educational background relevant to Data Science, or certified in Data Science courses 1-4 years of experience in the field of Automation & Analytics Worked on a proof of concept or case study solving complex business problems using data Strong analytical skills with business analysis experience or equivalent. Basic knowledge and understanding of financial services/banking operations is good to have. Delivery-focused, demonstrating an ability to work under pressure and within tight deadlines Basic knowledge of working in Python and other data science tools, and in visualization tools such as QlikSense or other visualization tools.
Experience in SQL/ETL tools is an added advantage. Understanding of big data tools (Teradata, Hadoop, etc.) and adoption of cloud technologies like GCP/AWS/Azure is good to have. Experience in data science and other machine learning algorithms (e.g., regression, classification) is an added advantage. Basic knowledge of data engineering skills – building data pipelines using modern tools/libraries (Spark or similar). You'll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued By HSBC Electronic Data Processing (India) Private LTD
Posted 6 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Min Experience: 4.0 years. Max Experience: 8.0 years. Skills: Kubernetes, PySpark, Docker, GitLab, dbt, Python, Reliability, Angular 2, Grafana, AWS, Monitoring and Observability. Location: Pune. Job description: Company Overview Bridgenext is a global consulting company that provides technology-empowered business solutions for world-class organizations. Our global workforce of over 800 consultants provides best-in-class services to our clients to realize their digital transformation journey. Our clients span the emerging, mid-market and enterprise space. With multiple offices worldwide, we are uniquely positioned to deliver digital solutions to our clients leveraging Microsoft, Java and Open Source with a focus on Mobility, Cloud, Data Engineering and Intelligent Automation. Emtec's singular mission is to create "Clients for Life" - long-term relationships that deliver rapid, meaningful, and lasting business value. At Bridgenext, we have a unique blend of corporate and entrepreneurial cultures. This is where you would have an opportunity to drive business value for clients while you innovate and continue to grow and have fun while doing it. You would work with team members who are vibrant, smart and passionate, and they bring their passion to all that they do – whether it's learning, giving back to our communities or always going the extra mile for our clients. Position Description We are looking for members with hands-on data engineering experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of code and who is passionate about providing the best solution to meet client needs, anticipating their future needs based on an understanding of the market. Someone who has worked on Hadoop projects, including processing and data representation using various AWS services. Must Have Skills: · 4-8 years of overall experience · Strong programming experience with Python and the ability to write modular code following best practices, backed by unit tests with a high degree of coverage (a minimal sketch follows below) · Knowledge of source control (Git/GitLab) · Understanding of deployment patterns along with knowledge of CI/CD and build tools · Knowledge of Kubernetes concepts and commands is a must · Knowledge of monitoring and alerting tools like Grafana and OpenTelemetry is a must · Knowledge of Astro/Airflow is a plus · Knowledge of data governance is a plus · Experience with cloud providers, preferably AWS · Experience with PySpark, Snowflake and dbt is good to have. Professional Skills: Solid written, verbal, and presentation communication skills Strong team and individual player Maintains composure during all types of situations and is collaborative by nature High standards of professionalism, consistently producing high-quality results Self-sufficient and independent, requiring very little supervision or intervention Demonstrates flexibility and openness to bring creative solutions to address issues
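As a small illustration of the "modular code backed by unit tests" expectation, here is a hypothetical sketch: a tiny, testable transformation function plus a pytest-style test. The function, file names, and data are invented for illustration; in a real repo the two snippets would live in separate modules.

```python
# transform.py - a small, testable unit (hypothetical example).
def dedupe_records(records: list[dict], key: str) -> list[dict]:
    """Keep the first occurrence of each value of `key`, preserving order."""
    seen: set = set()
    out: list[dict] = []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            out.append(rec)
    return out


# test_transform.py - pytest discovers and runs this test automatically.
def test_dedupe_records_keeps_first_occurrence():
    rows = [{"id": 1, "v": "a"}, {"id": 1, "v": "b"}, {"id": 2, "v": "c"}]
    assert dedupe_records(rows, key="id") == [
        {"id": 1, "v": "a"},
        {"id": 2, "v": "c"},
    ]
```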
Posted 6 days ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Experience in building data pipelines to ingest, process, and transform data from files, streams and databases (a minimal sketch follows below). Process the data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experience in developing efficient software code for multiple use cases leveraging the Spark Framework with Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop/Azure ecosystem components to implement scalable solutions to meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services. Preferred Education Master's Degree Required Technical And Professional Expertise Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; Minimum 3 years of experience on Cloud Data Platforms on Azure; Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB Good to excellent SQL skills Preferred Technical And Professional Experience Certification in Azure and Databricks, or Cloudera Spark Certified developers Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB Knowledge or experience of Snowflake will be an added advantage
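To illustrate the batch ingestion pattern described above, here is a minimal PySpark sketch of a file-to-table pipeline. Paths, column names, and the target table are hypothetical; a real deployment on Azure would point at ADLS, HDInsight, or Databricks equivalents.

```python
# Minimal PySpark batch-ingestion sketch: files -> transform -> managed table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("batch-ingest-sketch")
    .enableHiveSupport()  # assumes a Hive metastore is configured
    .getOrCreate()
)

# Ingest raw files, apply simple cleansing, and persist for analytics.
raw = spark.read.option("header", True).csv("/landing/events/*.csv")
clean = (
    raw.dropDuplicates(["event_id"])                    # assumed key column
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)
clean.write.mode("overwrite").saveAsTable("analytics.events_clean")
```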
Posted 6 days ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10-15 years] Understand current and future state enterprise architecture. Contribute to various technical streams during project implementation. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies (a minimal sketch of this pattern follows this posting). Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications. Experience in data security [on the move, at rest]. Strong UNIX operating system concepts and shell scripting knowledge. To qualify for the role, you must have A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of any of the cloud platforms: AWS, Azure or GCP. Excellent business communication, consulting, and quality process skills. Excellent consulting skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience. Ideally, you'll also have Strong project management skills Client management skills Solutioning skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that's right for you EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
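As a minimal sketch of the real-time ingestion pattern named in the responsibilities (Kafka plus Spark Structured Streaming), consider the following; the broker address, topic, schema, and output paths are assumptions, and the job additionally requires the spark-sql-kafka connector on the classpath.

```python
# Hedged sketch: consume JSON events from Kafka with Spark Structured
# Streaming and land them as Parquet. All identifiers are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Assumed event schema for illustration.
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "transactions")               # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/bronze/transactions")        # hypothetical sink
    .option("checkpointLocation", "/chk/transactions")  # required for recovery
    .outputMode("append")
    .start()
)
query.awaitTermination()
```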
Posted 6 days ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10-15 years] Understand current and future state enterprise architecture. Contribute to various technical streams during project implementation. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. Skills And Attributes For Success Experience architecting highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications. Experience in data security [on the move, at rest]. Strong UNIX operating system concepts and shell scripting knowledge. To qualify for the role, you must have A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of any of the cloud platforms: AWS, Azure or GCP. Excellent business communication, consulting, and quality process skills. Excellent consulting skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience. Ideally, you'll also have Strong project management skills Client management skills Solutioning skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that's right for you EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 6 days ago
10.0 - 15.0 years
83 - 104 Lacs
Delhi, Delhi
On-site
Job Title: Data Architect (Leadership Role) Company: Wingify Location: Delhi (outstation candidates allowed) Experience Required: 10-15 years Working Days: 5 days/week Budget: 83 Lakh to 1.04 Cr About Us We are a fast-growing product-based tech company known for our flagship product VWO, a widely adopted A/B testing platform used by over 4,000 businesses globally, including Target, Disney, Sears, and Tinkoff Bank. The team is self-organizing, highly creative, and passionate about data, tech, and continuous innovation. Company Size: Mid-Sized Industry: Consumer Internet, Technology, Consulting Role & Responsibilities Lead and mentor a team of Data Engineers, ensuring performance and career development. Architect scalable and reliable data infrastructure with high availability. Define and implement data governance frameworks, compliance, and best practices. Collaborate cross-functionally to execute the organization's data roadmap. Optimize data processing workflows for scalability and cost efficiency. Ensure data quality, privacy, and security across platforms. Drive innovation and technical excellence across the data engineering function. Ideal Candidate Must-Haves Experience: 10+ years in software/data engineering roles. At least 2-3+ years in a leadership role managing teams of 5+ Data Engineers. Proven hands-on experience setting up data engineering systems from scratch (0 → 1 stage) in high-growth B2B product companies. Technical Expertise: Strong in Java (preferred), or Python, Node.js, GoLang. Expertise in big data tools: Apache Spark, Kafka, Hadoop, Hive, Airflow, Presto, HDFS. Strong design experience in High-Level Design (HLD) and Low-Level Design (LLD). Backend frameworks like Spring Boot, Google Guice. Cloud data platforms: AWS, GCP, Azure. Familiarity with data warehousing: Snowflake, Redshift, BigQuery. Databases: Redis, Cassandra, MongoDB, TiDB. DevOps tools: Jenkins, Docker, Kubernetes, Ansible, Chef, Grafana, ELK. Other Skills: Strong understanding of data governance, security, and compliance (GDPR, SOC 2, etc.). Proven strategic thinking with the ability to align technical architecture to business objectives. Excellent communication, leadership, and stakeholder management. Preferred Qualifications Exposure to Machine Learning infrastructure/MLOps. Experience with real-time data analytics. Strong foundation in algorithms, data structures, and scalable systems. Previous work in SaaS or high-growth startups. Screening Questions Do you have team leadership experience? How many engineers have you led? Have you built a data engineering platform from scratch? Describe the setup. What's the largest data scale you've worked with, and where? Are you open to continuing hands-on coding in this role? Interested candidates can apply at deepak.visko@gmail.com or call 9238142824. Job Types: Full-time, Permanent Pay: ₹8,300,000.00 - ₹10,400,000.00 per year Work Location: In person
Posted 6 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Manager, Software Development Engineering leads a team of technical experts in successfully executing technology projects and solutions that align with the strategy and have broad business impact. The Manager, Software Development Engineering will work closely with development teams to identify and understand key features and their underlying functionality, while also partnering closely with Product Management and UX Design. They may exercise influence and govern overall end-to-end software development life cycle activities, including management of support and maintenance releases, minor functional releases, and major projects. The Manager, Software Development Engineering will lead and provide technical guidance for process improvement programs while leveraging engineering best practices. In this people leadership role, Managers will recruit, train, motivate, coach, grow and develop Software Development Engineer team members at a variety of levels through their technical expertise and by providing continuous feedback to ensure employee expectations, customer needs and product demands are met. About the Role: Lead and manage a team of engineers, providing mentorship and fostering a collaborative environment. Design, implement, and maintain scalable data pipelines and systems to support business analytics and data science initiatives. Collaborate with cross-functional teams to understand data requirements and ensure data solutions align with business goals. Ensure data quality, integrity, and security across all data processes and systems. Drive the adoption of best practices in data engineering, including coding standards, testing, and automation. Evaluate and integrate new technologies and tools to enhance data processing and analytics capabilities. Prepare and present reports on engineering activities, metrics, and project progress to stakeholders. About You: Proficiency in programming languages such as Python, Java, or Scala. Data engineering with APIs and any programming language (a minimal sketch follows this posting). Strong understanding of APIs, forward-looking knowledge of AI/ML tools and models, and some knowledge of software architecture. Experience with cloud platforms (e.g., AWS, Google Cloud) and big data technologies (e.g., Hadoop, Spark). Experience with REST/OData APIs. Strong problem-solving skills and the ability to work in a fast-paced environment. Excellent communication and interpersonal skills. Experience with data warehousing solutions such as BigQuery or Snowflake. Familiarity with data visualization tools and techniques. Understanding of machine learning concepts and frameworks. What's in it For You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions.
Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
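The data-engineering requirements above (Python, REST/OData APIs, warehousing on BigQuery or Snowflake) lend themselves to a short illustration. The following is a minimal sketch only, not Thomson Reuters' actual stack: it pages through a hypothetical REST endpoint, cleans the records with pandas, and stages them as Parquet for a warehouse bulk load.

```python
# Minimal REST-to-warehouse staging sketch. The endpoint URL and field
# names are hypothetical; pyarrow must be installed for to_parquet().
import requests
import pandas as pd

API_URL = "https://api.example.com/v1/events"  # hypothetical endpoint

def extract(page_size: int = 500) -> list[dict]:
    """Page through a REST API until it stops returning records."""
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, params={"page": page, "size": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1

def transform(records: list[dict]) -> pd.DataFrame:
    """Basic cleanup: typed timestamps, deduplication on the business key."""
    df = pd.DataFrame(records)
    df["event_ts"] = pd.to_datetime(df["event_ts"], utc=True)
    return df.drop_duplicates(subset=["event_id"])

if __name__ == "__main__":
    df = transform(extract())
    # Stage as Parquet; BigQuery and Snowflake both bulk-load this format.
    df.to_parquet("staging/events.parquet", index=False)
```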
Posted 6 days ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts. Job Category Software Engineering Job Details About Salesforce Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn’t a buzzword — it’s a way of life. The world of work as we know it is changing and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level-up your career at the company leading workforce transformation in the agentic era? You’re in the right place! Agentforce is the future of AI, and you are the future of Salesforce. As an engineering leader, you will focus on developing the team around you. Bring your technical chops to drive your teams to success around feature delivery and live-site management for a complex cloud infrastructure service. You are as enthusiastic about recruiting and building a team as you are about challenging technical problems that your team will solve. You will also help shape, direct and execute our product vision. You’ll be challenged to blend customer-centric principles, industry-changing innovation, and the reliable delivery of new technologies. You will work directly with engineering, product, and design to create experiences that reinforce the Salesforce brand by delighting and wowing our customers with highly reliable and available services.
Responsibilities Drive the vision of enabling a full suite of Salesforce applications on Google Cloud in collaboration with teams across geographies. Build and lead a team of engineers to deliver cloud frameworks, infrastructure automation tools, workflows, and validation platforms on our public cloud platforms. Solid experience in building and evolving large-scale distributed systems to reliably process billions of data points. Proactively identify reliability and data quality problems and drive the triaging and remediation process. Invest in continuous employee development of a highly technical team by mentoring and coaching engineers and technical leads in the team. Recruit and attract top talent. Drive execution and delivery by collaborating with cross-functional teams, architects, product owners and engineers. Experience managing 2+ engineering teams. Experience building services on public cloud platforms like GCP, AWS, Azure.
Required Skills/Experiences B.S./M.S. in Computer Science or an equivalent field. 12+ years of relevant experience in software development teams with 5+ years of experience managing teams. Passionate, curious, and creative self-starter who approaches problems with the right methodology and makes intelligent decisions. Laser focus on impact, balancing effort to value, and getting things done. Experience providing mentorship, technical leadership, and guidance to team members. Strong customer service orientation and a desire to help others succeed. Top-notch written and oral communication skills. 
Desired Skills/Experiences Working knowledge of modern technologies/services on public cloud is desirable. Experience with container orchestration systems such as Kubernetes, Docker, Helios, or Fleet. Expertise in open-source technologies like Elasticsearch, Logstash, Kafka, MongoDB, Hadoop, Spark, Trino/Presto, Hive, Airflow, and Splunk.
Benefits & Perks Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more! World-class enablement and on-demand training with Trailhead.com Exposure to executive thought leaders and regular 1:1 coaching with leadership Volunteer opportunities and participation in our 1:1:1 model for giving back to the community For more details, visit https://www.salesforcebenefits.com/
Unleash Your Potential When you join Salesforce, you’ll be limitless in all areas of your life. Our benefits and resources support you to find balance and be your best, and our AI agents accelerate your impact so you can do your best. Together, we’ll bring the power of Agentforce to organizations of all sizes and deliver amazing experiences that customers love. Apply today to not only shape the future — but to redefine what’s possible — for yourself, for AI, and the world. Accommodations If you require assistance due to a disability applying for open positions please submit a request via this Accommodations Request Form. Posting Statement Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive, and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
Posted 6 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within PwC.
Responsibilities Job Title: Cloud Engineer (Java 17+, Spring Boot, Microservices, AWS) Job Type: Full-Time Job Overview: As a Cloud Engineer, you will be responsible for developing, deploying, and managing cloud-based applications and services on AWS. You will use your expertise in Java 17+, Spring Boot, and Microservices to build robust and scalable cloud solutions. This role will involve working closely with development teams to ensure seamless cloud integration, optimizing cloud resources, and leveraging AWS tools to ensure high availability, security, and performance.
Key Responsibilities: Cloud Infrastructure: Design, build, and deploy cloud-native applications on AWS, utilizing services such as EC2, S3, Lambda, RDS, EKS, API Gateway, and CloudFormation. Backend Development: Develop and maintain backend services and microservices using Java 17+ and Spring Boot, ensuring they are optimized for the cloud environment. Microservices Architecture: Architect and implement microservices-based solutions that are scalable, secure, and resilient, ensuring they align with AWS best practices. CI/CD Pipelines: Set up and manage automated CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline for continuous integration and deployment. AWS Services Integration: Integrate AWS services such as DynamoDB, SQS, SNS, CloudWatch, and Elastic Load Balancing into microservices to improve performance and scalability (a brief queue-worker sketch follows this posting). 
Performance Optimization: Monitor and optimize the performance of cloud infrastructure and services, ensuring efficient resource utilization and cost management in AWS. Security: Implement security best practices in cloud applications and services, including IAM roles, VPC configuration, encryption, and authentication mechanisms. Troubleshooting & Support: Provide ongoing support and troubleshooting for cloud-based applications, ensuring uptime, availability, and optimal performance. Collaboration: Work closely with cross-functional teams, including frontend developers, system administrators, and DevOps engineers, to ensure end-to-end solution delivery. Documentation: Document the architecture, implementation, and operations of cloud infrastructure and applications to ensure knowledge sharing and compliance.
Required Skills & Qualifications: Strong experience with Java 17 or later and Spring Boot for backend development. Hands-on experience with AWS Cloud services such as EC2, S3, Lambda, RDS, EKS, API Gateway, DynamoDB, SQS, SNS, and CloudWatch. Proven experience in designing and implementing microservices architectures. Solid understanding of cloud security practices, including IAM, VPC, encryption, and secure cloud-native application development. Experience with CI/CD tools and practices (e.g., Jenkins, GitLab CI, AWS CodePipeline). Familiarity with containerization technologies like Docker, and orchestration tools like Kubernetes. Ability to optimize cloud applications for performance, scalability, and cost-efficiency. Experience with monitoring and logging tools like CloudWatch, ELK Stack, or other AWS-native tools. Knowledge of RESTful APIs and API Gateway for exposing microservices. Solid understanding of version control systems like Git and familiarity with Agile methodologies. Strong problem-solving and troubleshooting skills, with the ability to work in a fast-paced environment.
Preferred Skills: AWS certifications, such as AWS Certified Solutions Architect or AWS Certified Developer. Experience with Terraform or AWS CloudFormation for infrastructure as code. Familiarity with Kubernetes and EKS for container orchestration in the cloud. Experience with serverless architectures using AWS Lambda. Knowledge of message queues (e.g., SQS, Kafka) and event-driven architectures.
Education & Experience: Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent practical experience. 7-11 years of experience in software development with a focus on AWS cloud and microservices. 
Mandatory Skill Sets Cloud Engineer (Java + Spring Boot + AWS) Preferred Skill Sets Cloud Engineer (Java + Spring Boot + AWS) Years Of Experience Required 7-11 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Cloud Engineering Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
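The AWS integration duties above (SQS, SNS, DynamoDB) follow a standard consume-process-delete pattern. The posting's stack is Java 17/Spring Boot; purely to keep this document's examples in one language, here is a minimal Python/boto3 sketch of the same pattern, with a hypothetical queue URL. The flow is identical under the AWS SDK for Java.

```python
# SQS worker-loop sketch in Python/boto3. Illustrative only; the queue
# URL is hypothetical and handle() is a business-logic placeholder.
import json
import boto3

QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/orders"  # hypothetical

def handle(order: dict) -> None:
    # Placeholder for real work (e.g., write to DynamoDB, emit an SNS event).
    print("processing order", order.get("id"))

def main() -> None:
    sqs = boto3.client("sqs")
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,  # long polling reduces empty receives
        )
        for msg in resp.get("Messages", []):
            handle(json.loads(msg["Body"]))
            # Delete only after successful processing: SQS is at-least-once,
            # so unacknowledged messages reappear after the visibility timeout.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    main()
```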
Posted 6 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role? In this role, you will report to the Product Manager – Travel & Lifestyle Services (TLS). This is an exciting opportunity for a Product Owner/Analyst to work on data-related products and maintain the quality of TLS data in Cornerstone, the Big Data Platform.
Minimum Qualifications 5+ years’ experience in the travel domain, or at minimum a background in the financial domain. At least 5 years of experience in technology product management or data-related products. At least 5 years of experience in Software Architecture and Software Development. 3 years’ experience with SQL. Experience with agile methodologies and tools (e.g., Rally). An ability to solve complex problems and a highly analytical approach. Demonstrated curiosity and ability to learn, understand, and master the travel domain, with genuine excitement and passion for it. Self-starter with the ability to think creatively and strategically. Strong communication and stakeholder management skills. Excellent communication skills with the ability to engage, influence, and inspire partners to drive collaboration and alignment. Demonstrated ability to maintain a positive attitude and sense of humor in the face of chaos and challenges. Has a successful record of leading and coordinating business, delivery, and technology teams to define, prioritize, and deliver on a product roadmap. Strong product management skills with full ownership from analysis through implementation. High degree of organization, individual initiative, and personal accountability.
Platform Knowledge Experience working with Hadoop, the Cornerstone Big Data Platform, and Google Cloud Platform (GCP). Proficient in the Microsoft Office suite, Power BI, Tableau, and SQL.
Education Bachelor’s degree in a related field (Computer Science, Information Technology, Engineering, Electronics).
Preferred Qualifications Master’s degree in a related field (Computer Science, Information Technology, Engineering, Electronics).
We back you with benefits that support your holistic well-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 6 days ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities Job Description: Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications. Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL. Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards. Designing and implementing highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained. Working with other members of the project team to support delivery of additional project components (API interfaces). Evaluating the performance and applicability of multiple tools against customer requirements. Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Integrate Databricks with other technologies (ingestion tools, visualization tools). 
Proven experience working as a data engineer. Highly proficient in using the Spark framework (Python and/or Scala). Extensive knowledge of Data Warehousing concepts, strategies, and methodologies. Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks). Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics. Experience in designing and hands-on development in cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Designing and building of data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Thorough understanding of Azure Cloud Infrastructure offerings. Strong experience in common data warehouse modeling principles, including Kimball. Working knowledge of Python is desirable. Experience developing security models. Databricks & Azure Big Data Architecture Certification would be a plus.
Mandatory Skill Sets ADE, ADB, ADF Preferred Skill Sets ADE, ADB, ADF Years Of Experience Required 4-8 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
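The pipeline duties above can be made concrete with a small example. A minimal sketch, assuming a Databricks-style environment: extract raw JSON from a landing zone, enforce types, and append to a partitioned Delta table. The mount paths and column names are hypothetical.

```python
# PySpark batch-pipeline sketch for an Azure Databricks-style job.
# On Databricks the SparkSession already exists, so getOrCreate()
# simply returns the runtime session.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Extract: raw landing zone (e.g., populated by Azure Data Factory).
raw = spark.read.json("/mnt/landing/orders/")

# Transform: enforce types, drop bad rows, add a load date for partitioning.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: Delta table partitioned for downstream analytics.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("load_date")
      .save("/mnt/curated/orders/"))
```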
Posted 6 days ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support.
Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within….
Responsibilities
- Design, develop, and maintain scalable Java applications using Spring Boot and related technologies.
- Integrate various analytics services (e.g., Google Analytics, Power BI, Tableau, etc.) into platforms and applications.
- Collaborate with cross-functional teams to gather requirements and deliver technical solutions that align with business goals.
- Build and enhance products and platforms that support analytics capabilities, ensuring high performance and scalability.
- Write efficient, clean, and well-documented code that adheres to best practices.
- Develop and integrate RESTful APIs and microservices to support real-time data processing and analytics.
- Ensure continuous improvement by actively participating in code reviews and following best practices in development.
- Troubleshoot, debug, and resolve application issues and bugs.
- Collaborate with DevOps teams to ensure proper deployment and performance of analytics platforms in production environments.
- Stay updated with the latest industry trends and advancements in Java, Spring Boot, and analytics tools.
Required Qualifications:
- Experience in Java development, with a strong emphasis on Spring Boot.
- Proven experience integrating analytics services (e.g., Google Analytics, Power BI, Tableau) into applications and platforms.
- Hands-on experience in building and optimizing products or platforms for analytics and data processing. 
- Strong understanding of microservices architecture, RESTful APIs, and cloud-based deployment (e.g., AWS, Azure).
- Proficiency with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases.
- Solid understanding of object-oriented programming, design patterns, and software architecture principles.
- Experience with version control tools like Git.
- Excellent problem-solving and debugging skills.
- Strong communication skills, with the ability to work in a collaborative, fast-paced environment.
Preferred Qualifications:
- Experience with front-end technologies like JavaScript, React, or Angular is a plus.
- Knowledge of DevOps practices, CI/CD pipelines, and containerization tools (e.g., Docker, Kubernetes).
- Familiarity with big data tools and technologies such as Apache Kafka, Hadoop, or Spark.
- Experience working in an agile environment (an analytics-integration sketch follows this posting).
Mandatory skill sets: Java, Spring Boot, Kotlin Preferred skill sets: Java, Spring Boot, Kotlin Years of experience required: 4-8 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Go Programming Language Optional Skills Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment, Performance Management Software {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
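Analytics-service integration from a backend, as required above, usually reduces to posting events to a collector API. The role's stack is Java/Spring Boot/Kotlin; the sketch below uses Python only for consistency with this document's other examples, and the endpoint and payload shape are hypothetical.

```python
# Server-side analytics-event sketch. Illustrative only: the collector
# URL and payload fields are hypothetical stand-ins.
import requests

COLLECT_URL = "https://analytics.example.com/collect"  # hypothetical collector

def track_event(user_id: str, action: str) -> None:
    """Fire-and-forget analytics event with a hard timeout."""
    payload = {"user_id": user_id, "action": action, "source": "backend"}
    try:
        requests.post(COLLECT_URL, json=payload, timeout=2).raise_for_status()
    except requests.RequestException:
        # Analytics must never break the business flow; swallow and log.
        pass

track_event("u-123", "report_exported")
```

The same design point holds in any language: instrumentation calls should be bounded by a short timeout and must not propagate failures into the request path.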
Posted 6 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities Job Description: Job Summary: We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You’ll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog (a short governance sketch follows this posting).
Key Responsibilities: Design and implement ETL/ELT pipelines using Databricks and PySpark. Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets. Develop high-performance SQL queries and optimize Spark jobs. Collaborate with data scientists, analysts, and business stakeholders to understand data needs. Ensure data quality and compliance across all stages of the data lifecycle. Implement best practices for data security and lineage within the Databricks ecosystem. Participate in CI/CD, version control, and testing practices for data pipelines.
Required Skills: Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits). Strong hands-on skills with PySpark and Spark SQL. Solid experience writing and optimizing complex SQL queries. Familiarity with Delta Lake, data lakehouse architecture, and data partitioning. Experience with cloud platforms like Azure or AWS. Understanding of data governance, RBAC, and data security standards.
Preferred Qualifications: Databricks Certified Data Engineer Associate or Professional. Experience with tools like Airflow, Git, Azure Data Factory, or dbt. Exposure to streaming data and real-time processing. Knowledge of DevOps practices for data engineering. 
Mandatory Skill Sets Databricks Preferred Skill Sets Databricks Years Of Experience Required 7-14 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Databricks Platform Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date August 11, 2025
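Unity Catalog work of the kind described above combines pipeline code with declarative SQL grants. A minimal sketch, assuming a Unity Catalog-enabled Databricks workspace; the three-level table names and the `data-analysts` group are hypothetical.

```python
# Unity Catalog governance sketch for a Databricks notebook or job.
# Requires a Unity Catalog-enabled workspace; outside Databricks the
# GRANT statement below will not be recognized.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # returns the runtime session on Databricks

# Read through Unity Catalog's three-level namespace: catalog.schema.table.
orders = spark.table("main.sales.orders")

daily = (orders.groupBy(F.to_date("order_ts").alias("order_date"))
               .agg(F.sum("amount").alias("revenue")))

# Publish a curated table, then manage access declaratively in SQL;
# Unity Catalog records both the grant and the table lineage.
daily.write.mode("overwrite").saveAsTable("main.sales.daily_revenue")
spark.sql("GRANT SELECT ON TABLE main.sales.daily_revenue TO `data-analysts`")
```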
Posted 6 days ago
6.0 years
0 Lacs
India
Remote
Job Title: Data Scientist – Demand Forecasting Location: Remote Experience Required: 6+ Years
About the Role MindBrain Innovations Pvt Ltd is seeking a skilled and driven Data Scientist – Demand Forecasting to join our advanced analytics team. This role is pivotal in developing accurate and scalable demand forecasting solutions to guide key business decisions across inventory planning, staffing, and financial forecasting. The ideal candidate will combine technical proficiency in machine learning and time-series forecasting with strong communication and analytical problem-solving skills. You’ll collaborate with cross-functional teams to implement data science solutions that directly impact strategic planning and operations.
Key Responsibilities Develop and enhance time-series forecasting models using Python and SQL. Work with business stakeholders and software engineers to improve demand planning accuracy. Design, run, and analyze experiments to test improvements to current algorithms and forecasting strategies. Discover and integrate new data sources to improve model robustness and relevance. Translate complex model behavior into actionable business insights and clearly communicate underlying assumptions and limitations. Build and deploy scalable data pipelines that connect model development to production systems. Participate in cross-functional collaboration to ensure business users understand and trust forecast outputs. Stay current with industry trends and best practices to continually refine forecasting methodologies.
Required Qualifications Master’s degree in Data Science, Statistics, Applied Mathematics, Computer Science, Engineering, Physics, or a related quantitative discipline. 5+ years of experience in data science, analytics, or a related field focused on statistical modeling and data extraction. Advanced programming skills in Python and strong proficiency in SQL. Experience with large-scale data processing tools (e.g., Hadoop, Hive, Scala). Deep knowledge of time-series forecasting techniques, multivariate algorithms, and model validation methods. Proficient in feature engineering, model tuning, and hyperparameter optimization. Ability to write clean, production-ready, and well-documented code. Strong communication skills to convey technical insights clearly to business and engineering teams. Experience building automated and production-grade data pipelines. Ability to work both independently and in a collaborative team environment.
Preferred Qualifications Experience working in supply chain or demand planning environments. Familiarity with data visualization tools and dashboarding. Knowledge of the Azure cloud platform and its data services. Hands-on experience in building and deploying APIs for model integration. Prior involvement in stakeholder engagement and influencing decision-making without direct authority.
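Forecast-experiment work like the above typically starts from a simple, honest baseline. A minimal sketch on synthetic data: a seasonal-naive forecast evaluated with MAPE over a holdout window, the kind of benchmark any richer model must beat.

```python
# Seasonal-naive baseline with a holdout backtest. Synthetic demand
# series; in practice this would be replaced with real history.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
idx = pd.date_range("2024-01-01", periods=120, freq="D")
# Trend plus weekly seasonality plus noise.
demand = (
    100
    + 0.3 * np.arange(120)
    + 15 * np.sin(2 * np.pi * np.arange(120) / 7)
    + rng.normal(0, 5, 120)
)
series = pd.Series(demand, index=idx)

SEASON = 7    # weekly cycle
HOLDOUT = 14  # last two weeks reserved for validation
train, test = series[:-HOLDOUT], series[-HOLDOUT:]

# Seasonal naive: each forecast equals the value one season earlier.
forecast = pd.Series(
    [train.iloc[-SEASON + (h % SEASON)] for h in range(HOLDOUT)],
    index=test.index,
)

mape = ((test - forecast).abs() / test).mean() * 100
print(f"Seasonal-naive MAPE over {HOLDOUT}-day holdout: {mape:.1f}%")
```

The point of the baseline is methodological: every candidate model in an experiment is compared against it on the same holdout, so improvements are measured rather than assumed.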
Posted 6 days ago
0.0 years
0 - 0 Lacs
Thiruvananthapuram, Kerala
On-site
Data Science and AI Developer
Job Description: We are seeking a highly skilled and motivated Data Science and AI Developer to join our dynamic team. As a Data Science and AI Developer, you will be responsible for leveraging cutting-edge technologies to develop innovative solutions that drive business insights and enhance decision-making processes.
Key Responsibilities:
1. Develop and deploy machine learning models for predictive analytics, classification, clustering, and anomaly detection.
2. Design and implement algorithms for data mining, pattern recognition, and natural language processing.
3. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
4. Utilize advanced statistical techniques to analyze complex datasets and extract actionable insights.
5. Implement scalable data pipelines for data ingestion, preprocessing, feature engineering, and model training.
6. Stay updated with the latest advancements in data science, machine learning, and artificial intelligence research.
7. Optimize model performance and scalability through experimentation and iteration.
8. Communicate findings and results to stakeholders through reports, presentations, and visualizations.
9. Ensure compliance with data privacy regulations and best practices in data handling and security.
10. Mentor junior team members and provide technical guidance and support.
Requirements:
1. Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field.
2. Proven experience in developing and deploying machine learning models in production environments.
3. Proficiency in programming languages such as Python, R, or Scala, with strong software engineering skills.
4. Hands-on experience with machine learning libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Spark MLlib.
5. Solid understanding of data structures, algorithms, and computer science fundamentals.
6. Excellent problem-solving skills and the ability to think creatively to overcome challenges.
7. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
8. Certification in Data Science, Machine Learning, or Artificial Intelligence (e.g., Coursera, edX, Udacity, etc.).
9. Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
10. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) is an advantage.
Data Manipulation and Analysis: NumPy, Pandas
Data Visualization: Matplotlib, Seaborn, Power BI
Machine Learning Libraries: Scikit-learn, TensorFlow, Keras
Statistical Analysis: SciPy
Web Scraping: Scrapy
IDE: PyCharm, Google Colab
HTML/CSS/JavaScript/React JS: Proficiency in these core web development technologies is a must.
Python Django Expertise: In-depth knowledge of e-commerce functionality or deep Python Django knowledge.
Theming: Proven experience in designing and implementing custom themes for Python websites.
Responsive Design: Strong understanding of responsive design principles and the ability to create visually appealing and user-friendly interfaces for various devices.
Problem Solving: Excellent problem-solving skills with the ability to troubleshoot and resolve issues independently.
Collaboration: Ability to work closely with cross-functional teams, including marketing and design, to bring creative visions to life. 
Interns must know how to connect a front end with data science, and how to serve data science outputs back to the front end (a minimal example follows this posting).
Benefits:
- Competitive salary package
- Flexible working hours
- Opportunities for career growth and professional development
- Dynamic and innovative work environment
Job Type: Full-time Pay: ₹8,000.00 - ₹12,000.00 per month Ability to commute/relocate: Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Work Location: In person
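On connecting data science to a front end: one common pattern is to train a model and expose it behind a small HTTP endpoint the front end can call. A minimal sketch using scikit-learn and Flask on a toy dataset; the feature layout and route are illustrative only, not this employer's codebase.

```python
# Train-and-serve sketch: a scikit-learn classifier exposed over a small
# Flask API that any front end (fetch/axios) can call.
from flask import Flask, jsonify, request
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy training data standing in for a real, persisted dataset.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [0.1, -1.2, 0.4, 2.0]}.
    features = request.get_json(force=True)["features"]
    proba = model.predict_proba([features])[0][1]
    return jsonify({
        "prediction": int(proba >= 0.5),
        "probability": round(float(proba), 3),
    })

if __name__ == "__main__":
    app.run(port=5000)  # front end calls POST http://localhost:5000/predict
```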
Posted 6 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.
About the Team: The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics. The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way.
About the Role: We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence.
What you’ll be doing Software Development: Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews. Big Data Engineering: Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow (a brief DAG sketch follows this posting). Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency. Collaboration & Mentorship: Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering.
We’re excited if you have Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 5+ years of experience in software and/or data engineering with expertise in big data technologies such as Apache Spark, Apache Airflow and Trino. Strong understanding of SOLID principles and distributed systems architecture. Proven experience in distributed data processing, data warehousing, and real-time data pipelines. Advanced SQL skills, with expertise in query optimization for large datasets. Exceptional problem-solving abilities and the capacity to work independently or collaboratively. Excellent verbal and written communication skills. Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes. 
(preferred) Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto. (preferred) Strong programming skills in Python, Java, or Scala. (preferred) Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform). (preferred) Expertise in data modeling, schema design, and data visualization tools. (preferred) AI literacy and curiosity. You have either tried Gen AI in your previous work or outside of work, or are curious about Gen AI and have explored it.
Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.
The Roku Culture Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
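For the Spark-plus-Airflow pipelines described above, the orchestration layer is usually a small DAG. A minimal Airflow 2.x sketch, not Roku's actual code; the job paths and script names are hypothetical.

```python
# Airflow 2.x DAG sketch: a daily Spark batch job followed by a
# validation step. Paths and script names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {"retries": 2, "retry_delay": timedelta(minutes=10)}

with DAG(
    dag_id="ads_daily_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Each task is parameterized by the run's logical date ({{ ds }}),
    # so retries and backfills stay idempotent.
    rollup = BashOperator(
        task_id="spark_rollup",
        bash_command="spark-submit /opt/jobs/ads_rollup.py --date {{ ds }}",
    )
    validate = BashOperator(
        task_id="validate_counts",
        bash_command="python /opt/jobs/validate_counts.py --date {{ ds }}",
    )
    rollup >> validate  # validation runs only after the rollup succeeds
```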
Posted 6 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Description About Us Masco Home Products India (MHPI) is a fully owned subsidiary of Masco Corporation, headquartered in Livonia, MI. The vision of MHPI is to be recognized as a world-class Global Business Services organization driven by the desire for excellence in its people, business solutions, execution, and partnerships with internal customers to develop “Lean and Simple” business solutions. Headquartered in Livonia, Michigan, Masco Corporation is a global leader in the design, manufacture and distribution of branded home improvement and building products. Our portfolio of industry-leading brands includes Behr® paint; Delta® and hansgrohe® faucets, bath and shower fixtures; Liberty® branded decorative and functional hardware; and HotSpring® spas. We leverage our powerful brands across product categories, sales channels and geographies to create value for our customers and shareholders. For more information about Masco Corporation, visit www.masco.com Masco Home Products India (MHPI) MHPI (Masco Home Products India) | LinkedIn
Business Unit Supported: MASCO CANADA Website: https://www.mascocanada.com/ Position: BI - Data Engineer Location: India (Permanent Remote) Job Type: Permanent Experience required: Minimum of 5 years in data engineering, database design and ETL development Assessment Test: (based on role requirement) Candidates will be required to take a skill assessment test prior to the interview. Shift: 04:00 pm - 01:00 am IST (minimum 4.5 hrs EST overlap) Work hours: 9 hrs total (8.5 hrs working + 30-minute break) CTC: As per market standards Notice Period: Immediate joiners preferred; candidates serving notice with 30 or fewer working days remaining preferred. Must-have broadband availability: Minimum 30 Mbps (national service provider: Jio/Tata/Airtel/Hathway, etc.) Important: Access to a quiet home office environment with the above-mentioned broadband availability and working space to accommodate two monitors plus one laptop (based on role requirement).
Job Summary: Reports To: Senior Manager, Analytics. The Data Engineer will play a crucial role in the design, development, and maintenance of the organization’s data architecture. The primary focus will be on constructing, testing, and maintaining scalable data architecture and infrastructure to meet the business’s growing data needs. The Data Engineer shall also implement methods to improve data quality and reliability while ensuring high levels of availability, and shall develop test architectures that enable data extraction and transformation for predictive/prescriptive modelling. This role requires a strong foundation in data engineering, database management, and proficiency in various programming languages and data technologies.
Primary Responsibilities: Collaborate with cross-functional teams to understand data requirements, define data standards, and design efficient data models and architectures. Develop and implement strategies for data acquisition, transformation, and storage. Evaluate, recommend, and select data warehouse components, including hardware, database management systems, ETL software and data mining tools. Coordinate and work with other IT staff to develop database architectures, coding standards and quality assurance policies and procedures. Build and optimize large-scale data processing systems for batch and real-time data pipelines. Implement data integration solutions to collect and combine data from various sources. Design and manage databases, ensuring performance, security, and scalability. 
Conduct regular database maintenance, backups, and updates. Design and implement redundant systems, policies and procedures for disaster recovery and data archiving to ensure availability, protection, and integrity of data assets. Develop and maintain ETL (extract, transform, load) processes to ensure smooth flow of data from source to destination (an illustrative ETL sketch follows this posting). Troubleshoot and optimize ETL workflows for efficiency. Implement data quality checks and ensure data integrity throughout the data lifecycle. Enforce data governance policies and best practices. Work closely with data analysts, data governance and other stakeholders to understand their data requirements and provide support. Collaborate with IT and business teams to integrate data engineering solutions into existing systems. Identify and resolve performance bottlenecks in data pipelines and databases. Optimize queries and processes for improved efficiency. Maintain comprehensive documentation for data processes, workflows, and systems. Provide training and support to other team members as required.
Essential Skills: Excellent English communication skills, both verbal and written. Proficient with Microsoft applications and computer skills. Strong understanding of business processes and data flows. Strong interpersonal, communication and collaboration skills. Excellent problem solving, troubleshooting and analytical skills. Demonstrated successful ability to organize and prioritize work to ensure timely deadlines. Thrives in a team-oriented environment while capable of working autonomously. Strong attention to detail and ability to navigate ambiguous situations. Good time management and ability to manage multiple concurrent projects/tasks within time constraints.
Must Have “Technical” Skills: Minimum of 5 years in data engineering, database design and ETL development. Hands-on experience with data architecting, data mining, large-scale data modeling and business requirements gathering/analysis. Proficiency in programming languages such as Python, Java or SQL. Strong knowledge of database management systems (e.g., SQL, NoSQL). Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modeling and design principles. Advanced data manipulation skills: read in data, process and clean it, transform and recode it, merge different data sets, reformat data between wide and long, etc. Technical expertise in data models, data mining, and segmentation techniques. Experience with data processing flowcharting techniques.
Preferred Skills: Power BI visualization experience. Experience with data warehousing solutions. Knowledge of data security, compliance and applicable data privacy practices and laws. Understanding of machine learning concepts and data analytics. Familiarity with ERP systems and integrations (JDE) a plus. Experience with ServiceNow ticketing system a plus. Experience in Microsoft platforms, Databricks and Azure Data Factory preferred.
Education: Bachelor’s degree in Information Technology, Computer Science, or related field
Disclaimer: It has come to our attention that there have recently been some employment scams that have utilized reputable companies’ names, including ours, to solicit personal information as part of a fraudulent hiring scam. 
Please note that all of our open positions are posted at https://jobs.masco.com or https://www.linkedin.com/company/masco-home-products-private-limited and any role not posted there is not a role we have open. If you are seeking a position at Masco Home Products Private Limited (MHPI), we recommend that you write to us on Careers.MHPI@masco.com if you have any questions about our hiring process, need to verify a Masco Home Products Private Limited (MHPI) job posting or offer or need to speak with a MHPI representative directly. Company Masco Home Products India Full time Masco Home Products India (the “Company”) is an equal opportunity employer and we want to have the best available persons in every job. The Company makes employment decisions only based on merit. It is the Company’s policy to prohibit discrimination in any employment opportunity (including but not limited to recruitment, employment, promotion, salary increases, benefits, termination and all other terms and conditions of employment) based on race, color, sex, sexual orientation, gender, gender identity, gender expression, genetic information, pregnancy, religious creed, national origin, ancestry, age, physical/mental disability, medical condition, marital/domestic partner status, military and veteran status, height, weight or any other such characteristic protected by federal, state or local law. The Company is committed to complying with all applicable laws providing equal employment opportunities. This commitment applies to all persons involved in the operations of the Company regardless of where the employee is located and prohibits unlawful discrimination by any employee of the Company. Masco Corporation is an E-Verify employer. E-Verify is an Internet based system operated by the Department of Homeland Security (DHS) in partnership with the Social Security Administration (SSA) that allows participating employers to electronically verify the employment eligibility of their newly hired employees in the United States. Please click on the following links for more information. E-Verify Participation Poster: English & Spanish E-verify Right to Work Poster: English, Spanish
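The ETL responsibilities in this posting follow the classic extract-transform-load shape. A minimal sketch with hypothetical file, table, and column names, using SQLite as a stand-in target; the data-quality checks mirror the duties listed above.

```python
# End-to-end ETL sketch (extract -> transform -> load). Illustrative
# only: orders.csv, the column names, and the SQLite target are all
# hypothetical stand-ins for real sources and a real warehouse.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Read the raw source extract with typed dates."""
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Data-quality checks of the sort an ETL pipeline enforces."""
    df = df.dropna(subset=["order_id", "amount"])   # reject incomplete rows
    df = df[df["amount"] > 0]                       # reject negative amounts
    df["region"] = df["region"].str.strip().str.upper()
    return df.drop_duplicates(subset=["order_id"])  # one row per business key

def load(df: pd.DataFrame, db: str = "warehouse.db") -> None:
    """Append the cleaned batch to the target table."""
    with sqlite3.connect(db) as conn:
        df.to_sql("orders_clean", conn, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```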
Posted 6 days ago