Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
26.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES has delivered services to leading Fortune 500 companies across the Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking sectors. These are long-term relationships of more than 10 years, nurtured not only by our commitment to timely delivery of quality services but also by our investments and innovations in their technology roadmaps. As an organization, we are in an exponential growth phase with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified and committed individuals to play an exceptional role and to support our accelerated growth. You can learn more about us at: http://www.cesltd.com/

Job Description

Experience with Azure Synapse Analytics: Hands-on experience designing, developing, and deploying solutions with Azure Synapse Analytics, including familiarity with its components such as SQL pools, Spark pools, and Integration Runtimes.
Expertise in Azure Data Lake Storage: In-depth understanding of Azure Data Lake Storage, including its architecture, features, and best practices for managing a large-scale Data Lake or Lakehouse in an Azure environment.
Experience with AI Tools: Experience using AI tools and LLMs (e.g., GitHub Copilot, Copilot, ChatGPT) to automate many of the responsibilities outlined for this role.
Knowledge of Avro and Parquet: Experience working with the Avro and Parquet file formats, including data serialization, compression techniques, and schema evolution, and an understanding of their advantages and use cases in a big data environment.
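The Avro/Parquet bullet above hinges on schema evolution: records written under an old schema must remain readable under a newer one. As a minimal, library-free sketch of the pattern Avro formalizes (all field and function names here are illustrative, not from any CES system), a reader schema can project old records and fill newly added fields from declared defaults:

```python
# Hypothetical sketch of reader-side schema evolution. Records written
# with an old schema are read with a newer one; fields added later are
# filled from declared defaults, as Avro's schema-resolution rules do.

NEW_SCHEMA = {
    "patient_id": None,        # required, no default
    "result_value": None,      # required, no default
    "lab_site": "UNKNOWN",     # added in v2, default for old records
}

def read_with_schema(record: dict, schema: dict) -> dict:
    """Project a raw record onto the reader schema, applying defaults."""
    out = {}
    for field, default in schema.items():
        if field in record:
            out[field] = record[field]
        elif default is not None:
            out[field] = default
        else:
            raise KeyError(f"required field missing: {field}")
    return out

# An old (v1) record lacks the lab_site field introduced in v2.
old_record = {"patient_id": "p-1", "result_value": 4.2}
evolved = read_with_schema(old_record, NEW_SCHEMA)
```

Real Avro additionally carries the writer schema with the data and resolves types, not just presence; this sketch only shows the default-filling half of that contract.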
Healthcare: Prior experience working with data in a healthcare or clinical laboratory environment and a strong understanding of PHI, GDPR, and HIPAA/HITRUST is highly desirable.
Certifications: Relevant certifications such as Azure Data Engineer Associate or Azure Synapse Analytics Developer Associate are highly desirable.

Essential Functions

Data Integration and ELT Development: Design, develop, and maintain data pipelines for ingestion, transformation, and loading of data into Azure Synapse Analytics. This includes understanding functional and non-functional requirements, performing source data analysis and data profiling, and implementing efficient ELT processes.
Azure Synapse Development: Work with Azure Synapse Analytics to build and optimize data models, SQL queries, stored procedures, and other artifacts necessary for data processing and analysis.
Data Lake File Handling: Understand the characteristics of various file formats, optimize data storage, and implement efficient data reading and writing mechanisms for incremental updates within Azure Synapse Analytics.
Data Governance and Security: Ensure compliance with data governance policies and implement security measures to protect sensitive data stored in Azure, including encryption, masking, and access control mechanisms.
Performance Optimization: Continuously optimize data pipelines and storage configurations to improve performance, scalability, and reliability. This includes identifying bottlenecks, tuning queries, and leveraging Azure Synapse Analytics features for parallel processing.
Monitoring and Troubleshooting: Implement monitoring solutions to track data pipeline performance, data quality, and system health. Troubleshoot issues related to data ingestion, transformation, or storage, and provide timely resolutions.

Skills Needed to be Successful

Relational Database Experience: Proficiency with one or more of the following database platforms: Oracle, Microsoft SQL Server, PostgreSQL, MySQL/MariaDB.
Proficiency in SQL: Strong SQL skills, including experience with complex SQL queries, stored procedures, and performance optimization techniques. Familiarity with T-SQL for Azure Synapse Analytics is a plus.
ELT and Data Integration Skills: Proven experience building ELT pipelines and data integration solutions using tools like Azure Data Factory, Oracle GoldenGate, or similar platforms, with the ability to handle a variety of legacy data sources and file formats efficiently.
Data Modeling and Warehousing Concepts: Familiarity with dimensional modeling, star schemas, and data warehousing principles, plus experience designing and implementing data models for analytical workloads.
Analytical and Problem-Solving Abilities: Strong analytical skills, with the ability to understand complex data requirements, troubleshoot technical issues, and propose effective solutions to meet business needs.
Communication and Collaboration: Excellent communication skills and the ability to collaborate effectively with cross-functional teams, including Data Scientists, Reporting Analysts, and DevOps professionals.
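The data governance and security function above names masking as one control for sensitive data such as PHI. A minimal, stdlib-only sketch of one common masking technique, keyed pseudonymization with HMAC-SHA256 (the key, column names, and row shape are assumptions for illustration; real deployments would pull the key from a secrets store and apply masking inside the pipeline or database engine):

```python
import hashlib
import hmac

# Illustrative sketch only: deterministically pseudonymize identifier
# columns before they land in analytic tables. The key below is a
# placeholder; in practice it lives in a secrets store and is rotated.
MASKING_KEY = b"placeholder-key-from-secrets-store"

def mask_value(value: str) -> str:
    """Keyed hash of a sensitive value; deterministic so joins still work."""
    return hmac.new(MASKING_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

def mask_row(row: dict, sensitive_cols: set) -> dict:
    """Return a copy of the row with sensitive columns pseudonymized."""
    return {k: (mask_value(str(v)) if k in sensitive_cols else v)
            for k, v in row.items()}

row = {"mrn": "12345", "test": "HbA1c", "value": 5.9}
masked = mask_row(row, {"mrn"})
```

Because the hash is keyed and deterministic, the same identifier masks to the same token across tables, preserving join keys without exposing the raw value.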
Posted 5 days ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Reporting to the A/NZ DSE Chapter Manager India PEC within Decision Sciences & Engineering, this role will own and be responsible for the data and analytics engineering chapter in India PEC. The Data Engineer is an essential part of the business, enabling the team to support the ongoing acquisition and internal use of data, through to the fulfilment of products, insights, and systems. As a Data Engineer, you will work with our internal customers to ensure that data and systems are designed and built to move and manipulate data in a scalable, reusable, and efficient manner that suits the environment, project, security, and requirements.

What You’ll Do

Design, architect, and implement scalable and secure data pipelines on GCP, utilizing services like Dataflow, Pub/Sub, and Cloud Storage.
Develop and maintain data models, ensuring data quality, consistency, and accessibility for various internal stakeholders.
Automate data processes and workflows using scripting languages like Python, leveraging technologies like Spark and Airflow.
Monitor and troubleshoot data pipelines, identifying and resolving performance issues proactively.
Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions.
Implement data governance best practices, including data security, access control, and lineage tracking.
Lead security initiatives; design and implement security architecture.
Lead data quality initiatives; design and implement monitoring dashboards.
Mentor and guide junior data engineers, sharing knowledge and best practices to foster a high-performing team.

The role requires a solid educational foundation and the ability to develop a strategic vision and roadmap for D&A’s transition to the cloud while balancing delivery of near-term results that are aligned with execution.
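One recurring pattern behind pipeline work like the above is the incremental (high-watermark) load: track the last timestamp processed per source and pull only newer rows, so reruns are cheap and idempotent. A stdlib-only sketch using an in-memory SQLite database (table and column names are invented for the example; a GCP pipeline would hold the watermark in a metadata table and read from BigQuery or Pub/Sub):

```python
import sqlite3

# Minimal watermark-based incremental pull. The table name passed to
# incremental_pull comes from trusted config in this sketch, never from
# user input.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source_events (id INTEGER, loaded_at INTEGER);
    CREATE TABLE watermarks (source TEXT PRIMARY KEY, ts INTEGER);
    INSERT INTO source_events VALUES (1, 100), (2, 200), (3, 300);
    INSERT INTO watermarks VALUES ('source_events', 100);
""")

def incremental_pull(con, source):
    """Fetch rows newer than the stored watermark, then advance it."""
    (wm,) = con.execute(
        "SELECT ts FROM watermarks WHERE source = ?", (source,)).fetchone()
    rows = con.execute(
        f"SELECT id, loaded_at FROM {source} WHERE loaded_at > ? "
        "ORDER BY loaded_at", (wm,)).fetchall()
    if rows:
        con.execute("UPDATE watermarks SET ts = ? WHERE source = ?",
                    (rows[-1][1], source))
    return rows

batch = incremental_pull(con, "source_events")   # picks up rows after ts=100
rerun = incremental_pull(con, "source_events")   # nothing new
```

Advancing the watermark only after a successful read is what makes a rerun after failure safe: the worst case is reprocessing the same window, never skipping it.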
What Experience You Need

BS degree in a STEM major or equivalent discipline; Master’s degree strongly preferred
8+ years of experience as a data engineer or in a related role, with demonstrated leadership capabilities
Cloud certification strongly preferred
Expert-level skills in programming languages such as Python or SQL (BigQuery), and advanced experience with scripting languages
Demonstrated proficiency across Google Cloud services
Experience building and maintaining complex data pipelines, troubleshooting complex issues, and transforming and loading data into a pipeline so the content can be digested and used in future projects; proficiency in Airflow strongly desired
Experience designing and implementing advanced-to-complex data models, and experience enabling advanced optimization to improve performance
Experience leading a team; Git expertise strongly preferred
Hands-on experience with Agile methodologies
Working knowledge of CI/CD

What could set you apart:

A self-starter who identifies and responds to priority shifts with minimal supervision
Strong communication and presentation skills
Strong leadership qualities
A well-balanced view of resource management, thinking creatively and effectively to deploy the team while building skills for the future
Skilled in internal networking, negotiating, and proactively developing individuals and teams to be the best they can be
A strong communicator and presenter, bringing everyone on the journey
Knowledge of Big Data technology and tools, with the ability to share ideas among a collaborative team and to drive the team through technical expertise, learning, and sharing best practices
Excellent communication skills to engage with senior management, internal customers, and product management
Sound understanding of regulations and security requirements governing access to data in Big Data systems
Sound understanding of insight-delivery systems for batch and online
Ability to run Agile Scrum-based projects
Demonstrated problem-solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly budgets as well as product/project budgets
Experience working in a highly regulated environment
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About the Role:

As a Lead Data Scientist, you will independently execute specialized data tasks, focusing on model development, data interpretation, and driving forward the data science agenda. You will leverage advanced algorithms and statistical techniques to unearth insights and guide business decisions. This position suits self-driven professionals who excel at transforming raw data into strategic assets and are ready to contribute significantly to data science projects.

Responsibilities:

Lead the development and deployment of advanced machine learning models.
Perform in-depth data analysis to identify actionable insights.
Develop and maintain complex data processing pipelines.
Collaborate with stakeholders to align data science initiatives with business goals.
Drive feature engineering and selection processes.
Design and implement scalable data solutions for analytics.
Conduct exploratory data analysis to uncover new opportunities.
Ensure the robustness and reliability of data science projects.
Provide guidance on best practices for data science workflows.
Stay ahead of trends and continuously improve technical skills and knowledge.

Skills:

Advanced Statistical Methods: Proficient in applying complex statistical techniques.
Machine Learning Expertise: In-depth knowledge of machine learning algorithms and their practical applications.
Python/R/SAS: Advanced skills in Python, with knowledge of R or SAS for data analysis.
Big Data Technologies: Familiarity with big data tools like Spark and Hadoop.
Data Engineering: Proficiency in building and managing data pipelines.
Predictive Modeling: Expertise in developing and fine-tuning predictive models.
Communication: Excellent ability to translate data insights into business strategies.
Project Management: Strong project management skills to oversee data initiatives.

Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
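As a toy illustration of the predictive-modeling skill listed above (not Wolters Kluwer's actual stack, which would use libraries like scikit-learn), here is logistic regression fit by plain stochastic gradient descent on a one-feature separable dataset:

```python
import math

# Deliberately tiny sketch: learn a threshold on a single feature with
# logistic regression trained by per-sample gradient descent.
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]   # single feature
y = [0,   0,   0,   1,   1,   1]      # binary label

def sigmoid(z):
    if z < -60.0:        # numerical guard against exp overflow
        return 0.0
    if z > 60.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(w * xi + b)      # predicted probability of class 1
        w -= lr * (p - yi) * xi      # gradient of log-loss w.r.t. w
        b -= lr * (p - yi)           # gradient of log-loss w.r.t. b

preds = [int(sigmoid(w * xi + b) > 0.5) for xi in X]
```

After training, the learned decision boundary -b/w sits between the two classes, and `preds` reproduces the labels. Fine-tuning in practice adds regularization, validation splits, and calibrated thresholds on top of this loop.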
Posted 5 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description

Amazon Music is awash in data! To help make sense of it all, the DISCO (Data, Insights, Science & Optimization) team: (i) enables the Consumer Product Tech org to make data-driven decisions that improve customer retention, engagement, and experience on Amazon Music. We build and maintain automated self-service data solutions and data science models, and deep-dive into difficult questions to provide actionable insights. We also enable measurement, personalization, and experimentation by operating key data programs ranging from attribution pipelines and northstar weblab metrics to causal frameworks. (ii) delivers exceptional Analytics & Science infrastructure for DISCO teams, fostering a data-driven approach to insights and decision making. As platform builders, we are committed to constructing flexible, reliable, and scalable solutions to empower our customers. (iii) accelerates and facilitates content analytics and provides independence to generate valuable insights in a fast, agile, and accurate way. This domain provides analytical support for the following topics within Amazon Music: Programming / Label Relations / PR / Stations / Livesports / Originals / Case & CAM.

The DISCO team enables repeatable, easy, in-depth analysis of music customer behaviors. We reduce the cost in time and effort of analysis, data set building, model building, and user segmentation. Our goal is to empower all teams at Amazon Music to make data-driven decisions and effectively measure their results by providing high-quality, high-availability data and democratized data access through self-service tools. If you love the challenges that come with big data, then this role is for you. We collect billions of events a day, manage petabyte-scale data on Redshift and S3, and develop data pipelines using Spark/Scala on EMR, SQL-based ETL, Airflow, and Java services. We are looking for a talented, enthusiastic, and detail-oriented Data Engineer who knows how to take on big data challenges in an agile way.
Duties include big data design and analysis, data modeling, and the development, deployment, and operation of big data pipelines. You'll help build Amazon Music's most important data pipelines and data sets, and expand self-service data knowledge and capabilities through an Amazon Music data university. The DISCO team develops data specifically for a set of key business domains like personalization and marketing, and provides and protects a robust self-service core data experience for all internal customers. We deal in AWS technologies like Redshift, S3, EMR, EC2, DynamoDB, Kinesis Firehose, and Lambda. Your team will manage the data exchange store (Data Lake) and the EMR/Spark processing layer, using Airflow as the orchestrator. You'll build our data university and partner with Product, Marketing, BI, and ML teams to build new behavioural events, pipelines, datasets, models, and reporting to support their initiatives. You'll also continue to develop big data pipelines.

Key job responsibilities

Bring a deep understanding of data, analytical techniques, and how to connect insights to the business, along with practical experience insisting on the highest standards for operations in ETL and big data pipelines. With our Amazon Music Unlimited and Prime Music services, and our top music provider spot on the Alexa platform, providing high-quality, high-availability data to our internal customers is critical to our customer experience.
Assist the DISCO team with management of our existing environment, which consists of Redshift and SQL-based pipelines. The activities around these systems are well defined via standard operating procedures (SOPs) and typically involve approving data access requests and subscribing or adding new data to the environment.
Manage SQL data pipelines (creating or updating existing pipelines).
Perform maintenance tasks on the Redshift cluster.
Assist the team with the management of our next-generation AWS infrastructure.
Tasks include infrastructure monitoring via CloudWatch alarms, infrastructure maintenance through code changes or enhancements, and troubleshooting/root-cause analysis of infrastructure issues that arise; in some cases this resource may also be asked to submit code changes based on infrastructure issues that arise.

About The Team

Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, we are innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all music in shuffle mode, plus top ad-free podcasts, included with their membership; customers can upgrade to Music Unlimited for unlimited on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale.

Basic Qualifications

2+ years of data engineering experience
Experience with data modeling, warehousing, and building ETL pipelines
Experience with SQL
Experience with one or more scripting languages (e.g., Python, KornShell)
Experience in Unix
Experience troubleshooting data and infrastructure issues

Preferred Qualifications

Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or Datastage
Knowledge of distributed systems as they pertain to data storage and computing
Experience building or administering reporting/analytics platforms

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2838395
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview

The Senior Data Science Engineer will leverage advanced data science techniques to solve complex business problems, guide decision-making processes, and mentor junior team members. This role requires a combination of technical expertise in data analysis and machine learning with project management skills.

Responsibilities

Data Analysis and Modeling: Analyze large-scale telecom datasets to extract actionable insights and build predictive models for network optimization and customer retention. Conduct statistical analyses to validate models and ensure their effectiveness.
Machine Learning Development: Design and implement machine learning algorithms for fraud detection, churn prediction, and network failure analysis.
Telecom-Specific Analytics: Apply domain knowledge to improve customer experience by analyzing usage patterns, optimizing services, and predicting customer lifetime value.
ETL Processes: Develop robust pipelines for extracting, transforming, and loading telecom data from diverse sources.
Collaboration: Work closely with data scientists, software engineers, and telecom experts to deploy solutions that enhance operational efficiency.
Data Governance: Ensure data integrity, privacy, security, and compliance with industry standards.

Requirements

Advanced degree in Data Science, Statistics, Computer Science, or a related field.
Extensive experience in data science roles with a strong focus on machine learning and statistical modeling.
Proficiency in programming languages such as Python or R, and strong SQL skills.
Familiarity with big data technologies (e.g., Hadoop, Spark) is advantageous.
Expertise in cloud platforms such as AWS or Azure.
Posted 5 days ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description

Oracle Customer Success Services

Building on the mindset that "Who knows Oracle … better than Oracle?", Oracle Customer Success Services assists customers with their requirements for some of the most cutting-edge applications and solutions. It combines the strengths of more than two decades of expertise in developing mission-critical solutions for enterprise customers with cutting-edge technology, giving our customers the speed, flexibility, resiliency, and security to optimize their investment, minimize risk, and achieve more. The business was established with an entrepreneurial mindset and supports a vibrant, imaginative, and highly varied workplace. We are free of obligations, so we'll need your help to turn it into a premier engineering hub that prioritizes quality.

Why?

Oracle Customer Success Services Engineering is responsible for designing, building, and managing cutting-edge solutions, services, and core platforms to support the managed cloud business, including but not limited to Oracle Cloud Infrastructure (OCI), Oracle Cloud Applications (SaaS), and Oracle Enterprise Applications. This position is for the CSS Architecture Team, and we are searching for the finest and brightest technologists as we embark on the road of cloud-native digital transformation. We operate under a garage culture, rely on cutting-edge technology in our daily work, and provide a highly innovative, creative, and experimental work environment. We prefer to innovate and move quickly, putting a strong emphasis on scalability and robustness. We need your assistance to build a top-tier engineering team that has a significant influence.

What?

As a Principal Data Science & AIML Engineer within the CSS CDO Architecture & Platform team, you’ll lead efforts in designing and building scalable, distributed, resilient services that provide artificial intelligence and machine learning capabilities on OCI and Oracle Cloud Applications for the business.
You will be responsible for the design and development of machine learning systems and applications, ensuring they meet the needs of our clients and align with the company's strategic objectives. The ideal candidate will have extensive experience in machine learning algorithms, model creation and evaluation, data engineering and data processing for large-scale distributed systems, and software development methodologies. We strongly believe in ownership and challenging the status quo. We expect you to bring critical thinking and long-term design impact while building solutions and products, defining system integrations, and addressing cross-cutting concerns. Being part of the architecture function also gives you the unique ability to establish new processes and design patterns that will be future-proof while building new services or products. As a thought leader, you will own and lead the complete SDLC, from architecture design, development, and test through operational readiness and platform SRE.

Responsibilities

As a member of the architecture team, you will be in charge of designing software products, services, and platforms, as well as creating, testing, and managing the systems and applications we create in line with the architecture patterns and standards. As a core member of the Architecture Chapter, you will be expected to advocate for the adoption of software architecture and design patterns among cross-functional teams both within and outside of engineering roles. You will also be expected to act as a mentor and advisor to the team(s) within the software and AIML domain. As we push for digital transformation throughout the organization, you will constantly be expected to think creatively and to optimize and harmonize business processes.

Core Responsibilities

Lead the development of machine learning models, their integration with the full-stack software ecosystem, and data engineering, and contribute to the design strategy.
Collaborate with product managers and development teams to identify software requirements and define project scopes.
Develop and maintain technical documentation, including architecture diagrams, design specifications, and system diagrams.
Analyze and recommend new software technologies and platforms to ensure the company stays ahead of the curve.
Work with development teams to ensure software projects are delivered on time, within budget, and to the required quality standards.
Provide guidance and mentorship to junior developers.
Stay up-to-date with industry trends and developments in software architecture and development practices.

Required Qualifications

Bachelor's or Master's Degree in Computer Science, Machine Learning/AI, or a closely related field.
6+ years of experience in software development, machine learning, data science, and data engineering design.
Proven ability to build and manage enterprise-distributed and/or cloud-native systems.
Broad knowledge of cutting-edge machine learning models and strong domain expertise in both traditional and deep learning, particularly in areas such as Recommendation Engines, NLP & Transformers, Computer Vision, and Generative AI.
Advanced proficiency in Python and frameworks such as FastAPI, Dapr, and Flask, or equivalent.
Deep experience with ML frameworks such as PyTorch, TensorFlow, and Scikit-learn.
Hands-on experience building ML models from scratch, transfer learning, and Retrieval-Augmented Generation (RAG) using various techniques (Native, Hybrid, C-RAG, Graph RAG, Agentic RAG, and Multi-Agent RAG).
Experience building agentic systems with SLMs and LLMs using frameworks like LangGraph + LangChain, AutoGen, LlamaIndex, Bedrock, Vertex, Agent Development Kit, Model Context Protocol (MCP), and Haystack, or equivalent.
Experience in Data Engineering using data lakehouse stacks, ETL/ELT, and data processing with Apache Hadoop, Spark, Flink, Beam, and dbt.
Experience with data warehouses and lakes such as Apache Iceberg, Hudi, and Delta Lake, and cloud-managed solutions like OCI Data Lakehouse.
Experience in data visualization and analytics with Apache Superset, Apache Zeppelin, Oracle Analytics Cloud, or similar.
Hands-on experience working with various data types and storage formats, including NoSQL, SQL, and graph databases, and data serialization formats like Parquet and Arrow.
Experience with real-time distributed systems using streaming data with Kafka, NiFi, or Pulsar.
Strong expertise in software design concepts, patterns (e.g., 12-Factor Apps), and tools to create CNCF-compliant software, with hands-on knowledge of containerization technologies like Docker and Kubernetes.
Proven ability to build and deploy software applications on one or more public cloud providers (OCI, AWS, Azure, GCP, or similar).
Demonstrated ability to write full-stack applications using polyglot programming with languages/frameworks like FastAPI, Python, and Golang.
Experience designing API-first systems with application stacks like FARM and MERN, and technologies such as gRPC and REST.
Solid understanding of Design Thinking, Test-Driven Development (TDD), BDD, and the end-to-end SDLC.
Experience in DevOps practices, including Kubernetes, CI/CD, and Blue-Green and Canary deployments.
Experience with microservice architecture patterns, including API Gateways, Event-Driven & Reactive Architecture, CQRS, and SAGA.
Familiarity with OOP design principles (SOLID, DRY, KISS, Common Closure, and Module Encapsulation).
Proven ability to design software systems using various design patterns (Creational, Structural, and Behavioral).
Strong interpersonal skills and the ability to effectively communicate with business stakeholders.
Demonstrated ability to drive technology adoption in AIML solutions and the CNCF software stack.
Excellent analytical, problem-solving, communication, and leadership skills.
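A stripped-down sketch of the retrieval step behind the RAG techniques listed above, using bag-of-words cosine similarity in place of an embedding model and vector store (document texts and function names are invented for illustration):

```python
import math
from collections import Counter

# Toy retrieval: score each document against the query by cosine
# similarity over word-count vectors, then return the best match to
# place into the model prompt. Production RAG swaps in embeddings and
# an approximate-nearest-neighbor index.
DOCS = [
    "OCI Data Lakehouse stores curated Parquet tables",
    "Kubernetes schedules containers across the cluster",
    "Kafka topics carry streaming events between services",
]

def bow(text):
    """Bag-of-words vector as a Counter of lowercased tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    """Return the document most similar to the query."""
    q = bow(query)
    return max(docs, key=lambda d: cosine(q, bow(d)))

best = retrieve("which service carries streaming events", DOCS)
```

The retrieved text is then concatenated into the LLM prompt; the Graph, Agentic, and Multi-Agent RAG variants named above differ mainly in how this retrieval step is structured and orchestrated, not in this basic score-and-select flow.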
Qualifications

Career Level - IC4

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 5 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
KAYConnect LLC is hiring Business Development Executives – and we want YOU!

💼 Who Can Apply?
📌 Recent graduates
📌 Excellent communicators
📌 Eager learners with a spark to grow

📍 Location: Sector 63, Noida (onsite only)
🕒 Timings: 6:30 PM – 3:30 AM IST (US shift; night shifts only)
💼 Employment Type: Full-time

💎 Perks You’ll Love:
✅ Fixed weekends off (Saturday–Sunday)
✅ Complimentary meals
✅ Safe cab facility for female employees
✅ Competitive salary + performance-based incentives

If you’re driven, enthusiastic, and ready to build a rewarding career, this is your launchpad!
📧 Apply Now: hariom@kayconnect.net
Let’s grow together. 💼🌱
Posted 5 days ago
10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities

Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years of experience).
Understand current and future-state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success

Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with all Azure/AWS/GCP/Big Data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETL tools, Spark, and Databricks.
Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
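The Kafka/Spark Streaming ingestion described above reduces, at its core, to keyed aggregation over fixed time windows. A stdlib-only sketch with an invented event shape (a real job would consume from Kafka partitions and let Spark Structured Streaming maintain the windows):

```python
from collections import defaultdict

# Toy windowed aggregation: events carry an epoch timestamp and a key;
# counts are rolled up per fixed window per key. The 60-second window
# and (timestamp, key) event shape are assumptions for the example.
WINDOW_SECONDS = 60

def window_counts(events):
    """events: iterable of (epoch_seconds, key).
    Returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # floor to window
        counts[(window_start, key)] += 1
    return dict(counts)

stream = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
agg = window_counts(stream)
```

A production streaming engine adds what this sketch omits: watermarks for late-arriving events, state checkpointing, and exactly-once sinks.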
Experience in performance benchmarking of enterprise applications.
Experience in data security [in transit, at rest].
Strong UNIX operating system concepts and shell scripting knowledge.
To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communicator (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
Strong verbal and written communication skills.
Must be a team player who enjoys working in a cooperative and collaborative team environment.
Adaptable to new technologies and standards.
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for the evaluation of technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality-process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 10 years of industry experience.
Ideally, you’ll also have
Strong project management skills.
Client management skills.
Solutioning skills.
What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
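The Spark Streaming and Kafka ingestion work described in this role follows a micro-batch model: group an unbounded record stream into small fixed-size batches, then transform and write each batch downstream. A rough, dependency-free sketch of that idea (the record source and field names are hypothetical stand-ins, not Spark or Kafka APIs):

```python
from itertools import islice
from typing import Dict, Iterable, Iterator, List

def micro_batches(records: Iterable[Dict], batch_size: int) -> Iterator[List[Dict]]:
    """Group a (possibly unbounded) record stream into fixed-size
    micro-batches, the processing model behind Spark Streaming-style
    ingestion."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:          # stream exhausted
            return
        yield batch

# Hypothetical stand-in for records arriving from a Kafka topic.
events = ({"id": i, "value": i * 10} for i in range(7))
batches = list(micro_batches(events, batch_size=3))
# 7 records with a batch size of 3 yield batches of sizes 3, 3, and 1.
```

In a real Spark Structured Streaming job the framework handles batching, offset tracking, and fault tolerance; the sketch only shows the shape of the micro-batch idea.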
Posted 5 days ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
About the Role: We are looking for a highly motivated Business Development Executive (BDE) with not more than 5 years of experience to join our growing Sales and Marketing team. Responsibilities include generating new sales opportunities and maintaining relationships with clients to drive business growth. This role is ideal for individuals with strong communication skills, persistence, and a passion for sales. Key Responsibilities: Proactively initiate conversations with key decision-makers to spark interest in the company’s offerings and uncover potential opportunities. Conduct targeted outreach through channels like LinkedIn, email, and more to identify and qualify Sales-Qualified Leads (SQLs) and Meeting-Qualified Leads (MQLs) aligned with the Ideal Client Persona. Maintain up-to-date and detailed records of all prospect interactions to ensure a clean and actionable sales pipeline. Support the BDM or sales team with research and prospecting. Build and maintain relationships with prospects through timely and consistent follow-ups, aiming to convert interest into qualified sales opportunities. Requirements: 1-3 years (not more than 3 years) of experience in sales or B2B business development (preferably in Services, SaaS, and tech sales). Strong communication skills. Experience with CRM tools is a plus. Ability to handle rejection and maintain a positive, goal-oriented mindset. Self-motivated, proactive, and eager to learn. Bachelor's degree in Business, Marketing, Communications, or a related field is preferred. Why ViTWO? At ViTWO, we pride ourselves on being a leading CFO service provider with a proven track record of delivering exceptional financial solutions. Our innovative software and AI-driven products empower businesses across the globe to streamline their financial operations and achieve sustainable growth.
With strong credentials and trust built from diverse vendors and clients worldwide, ViTWO offers a dynamic and collaborative work environment where innovation thrives. Join us to be part of a team that’s redefining financial excellence on a global scale!
Posted 5 days ago
1.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Role: Junior Motion Graphics Designer Location: Mumbai, India About the role: Working in true collaboration with our client, we have one goal in mind: ‘to be the leading agency partner for the development of stunning and effective needs-based content and digital media campaigns’. We are seeking a creative and motivated Junior Motion Graphics Designer to join our dynamic creative team. This role is ideal for someone passionate about animation and graphic design, eager to learn, and looking to contribute to a diverse range of digital media projects. The Junior Motion Graphics Designer will collaborate with senior designers and other team members to create engaging visual content for various channels. What you will be doing: Assist in creating high-quality motion graphics and animations for a variety of media including web, television, and social media. Support the development of storyboards and design concept sketches. Work closely with the creative team to bring innovative ideas to life, ensuring alignment with project objectives and brand guidelines. Collaborate with other departments such as marketing and production to ensure cohesive delivery of final projects. Assist in managing project timelines and deliverables to ensure timely completion of tasks.
Maintain organized project files and documentation for easy access and updates. Receive and incorporate feedback from Senior Motion Graphics Designers and clients to improve design and animation work. Assist in editing and refining motion graphics projects to enhance their visual appeal and effectiveness. Stay updated on the latest trends and techniques in motion graphics and animation to bring fresh ideas to the team. Participate in training and professional development opportunities to enhance skills. What you need to be great in this role: Bachelor’s degree in Graphic Design, Animation, Fine Arts, or a related field. 1-2 years of experience in motion graphics design or a related field (internships and freelance work considered). Proficiency in design and animation software such as Adobe After Effects, Adobe Premiere Pro, Adobe Illustrator, and Adobe Photoshop. A strong portfolio showcasing digital animation and motion design skills. Strong attention to detail and ability to work effectively in a fast-paced environment. Good communication skills and a collaborative spirit. Basic understanding of 3D animation software (e.g., Cinema 4D, Blender) is a plus. Passion for and inquisitiveness about AI and new technologies. Understanding and knowledge of AI tools is beneficial, but the ability to learn and digest the benefits and features of AI tools is critical. Req ID: 13127 Our values shape everything we do: Be Ambitious to succeed. Be Imaginative to push the boundaries of what’s possible. Be Inspirational to do groundbreaking work. Be always learning and listening to understand. Be Results-focused to exceed expectations. Be actively pro-inclusive and anti-racist across our community, clients and creations. OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected.
All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 5 days ago
2.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a UI/UX Designer, you’ll lead the conceptualization and implementation of user-centric design strategies, elevating our digital experiences across platforms. Your role encompasses the translation of complex concepts into intuitive, engaging interfaces that enhance user satisfaction. You’ll collaborate closely with cross-functional teams, leveraging your expertise to drive the design vision while ensuring alignment with business objectives. Through innovative design thinking and a deep understanding of user behavior, you’ll craft visually stunning and functional solutions that resonate with our audience. Your contributions will significantly impact our brand identity and user engagement, influencing the overall success of our digital products.
You're Good At Driving and contributing to the entire interaction design process, steering the product design from concept to implementation, excelling in rapid ideation (Sketching, White-boarding, etc.), wireframes & prototypes creation, visual design, and overseeing front-end implementation, while adeptly resolving blockers and identifying innovative standards and patterns. Seamless collaboration with Engineering within agile SCRUM or iterative development cycles, ensuring precise alignment of UI requirements while skillfully balancing technical challenges with interaction design best practices. Mastering presentations, both in creation and delivery, adeptly managing stakeholder conversations and relationships, and showcasing deep empathy for clients and users. Embracing a user-centered design approach, leveraging a comprehensive understanding of the visual design spectrum, data visualization, responsive and adaptive interface designs, accessibility standards, and user research methodologies to introduce innovative approaches aligned with industry and technology trends. Establishing and implementing standards, effectively evangelizing interaction design guidance across the organization, and articulating concepts comprehensively at all organizational levels. Demonstrating quick ideation validation, embracing a fail-fast, learn-forward mentality, and showcasing strong collaborative skills adaptable to various software development models and life cycles. What You'll Bring 2 - 4 years of relevant experience in a reputed software consulting/product organization, showcasing a portfolio that highlights successful projects and a strong understanding of design principles. Proficiency in industry-standard design tools (e.g., Adobe Creative Suite, Sketch, Figma) and a deep understanding of usability principles and user-centered design methodologies. 
Bachelor’s degree in Design, Human-Computer Interaction, or a related field; advanced certifications in UX/UI design are a plus. #BCGXjob Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Posted 5 days ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description Oracle Customer Success Services Building on the mindset that "Who knows Oracle better than Oracle?", Oracle Customer Success Services assists customers with their requirements for some of the most cutting-edge applications and solutions, utilizing the strengths of more than two decades of expertise in developing mission-critical solutions for enterprise customers and combining it with cutting-edge technology to provide our customers speed, flexibility, resiliency, and security, enabling them to optimize their investment, minimize risk, and achieve more. The business was established with an entrepreneurial mindset and supports a vibrant, imaginative, and highly varied workplace. We are free of obligations, so we'll need your help to turn it into a premier engineering hub that prioritizes quality. Why? Oracle Customer Success Services Engineering is responsible for designing, building, and managing cutting-edge solutions, services, and core platforms to support the managed cloud business, including but not limited to Oracle Cloud Infrastructure (OCI), Oracle Cloud Applications (SaaS) & Oracle Enterprise Applications. This position is for the CSS Architecture Team, and we are searching for the finest and brightest technologists as we embark on the road of cloud-native digital transformation. We operate under a garage culture, rely on cutting-edge technology in our daily work, and provide a highly innovative, creative, and experimental work environment. We prefer to innovate and move quickly, putting a strong emphasis on scalability and robustness. We need your assistance to build a top-tier engineering team that has a significant influence. What? As a Principal Data Science & AIML Engineer within the CSS CDO Architecture & Platform team, you’ll lead efforts in designing and building scalable, distributed, resilient services that provide artificial intelligence and machine learning capabilities on OCI & Oracle Cloud Applications for the business.
You will be responsible for the design and development of machine learning systems and applications, ensuring they meet the needs of our clients and align with the company's strategic objectives. The ideal candidate will have extensive experience in machine learning algorithms, model creation and evaluation, data engineering and data processing for large-scale distributed systems, and software development methodologies. We strongly believe in ownership and challenging the status quo. We expect you to bring critical thinking and long-term design impact while building solutions and products, defining system integrations, and addressing cross-cutting concerns. Being part of the architecture function also provides you with the unique ability to enforce new processes and design patterns that will be future-proof while building new services or products. As a thought leader, you will own and lead the complete SDLC, from architecture design and development through testing, operational readiness, and platform SRE. Responsibilities As a member of the architecture team, you will be in charge of designing software products, services, and platforms, as well as creating, testing, and managing the systems and applications we create in line with the architecture patterns and standards. As a core member of the Architecture Chapter, you will be expected to advocate for the adoption of software architecture and design patterns among cross-functional teams both within and outside of engineering roles. You will also be expected to act as a mentor and advisor to the team(s) within the software and AIML domain. As we push for digital transformation throughout the organization, you will constantly be expected to think creatively and to optimize and harmonize business processes. Core Responsibilities Lead the development of machine learning models and their integration with the full-stack software ecosystem and data engineering, and contribute to the design strategy.
Collaborate with product managers and development teams to identify software requirements and define project scopes. Develop and maintain technical documentation, including architecture diagrams, design specifications, and system diagrams. Analyze and recommend new software technologies and platforms to ensure the company stays ahead of the curve. Work with development teams to ensure software projects are delivered on time, within budget, and to the required quality standards. Provide guidance and mentorship to junior developers. Stay up-to-date with industry trends and developments in software architecture and development practices. Required Qualifications Bachelor's or Master's Degree in Computer Science, Machine Learning/AI, or a closely related field. 6+ years of experience in software development, machine learning, data science, and data engineering design. Proven ability to build and manage enterprise-distributed and/or cloud-native systems. Broad knowledge of cutting-edge machine learning models and strong domain expertise in both traditional and deep learning, particularly in areas such as Recommendation Engines, NLP & Transformers, Computer Vision, and Generative AI. Advanced proficiency in Python and frameworks such as FastAPI, Dapr & Flask or equivalent. Deep experience with ML frameworks such as PyTorch, TensorFlow, and Scikit-learn. Hands-on experience building ML models from scratch, transfer learning, and Retrieval Augmented Generation (RAG) using various techniques (Native, Hybrid, C-RAG, Graph RAG, Agentic RAG, and Multi-Agent RAG). Experience building Agentic Systems with SLMs and LLMs using frameworks like LangGraph + LangChain, AutoGen, LlamaIndex, Bedrock, Vertex, Agent Development Kit, Model Context Protocol (MCP), and Haystack or equivalent. Experience in Data Engineering on data lakehouse stacks, including ETL/ELT and data processing with Apache Hadoop, Spark, Flink, Beam, and dbt.
Experience with Data Warehousing and Lakes such as Apache Iceberg, Hudi, Delta Lake, and cloud-managed solutions like OCI Data Lakehouse. Experience in data visualization and analytics with Apache Superset, Apache Zeppelin, Oracle Analytics Cloud or similar. Hands-on experience working with various data types and storage formats, including NoSQL, SQL, Graph databases, and data serialization formats like Parquet and Arrow. Experience with real-time distributed systems using streaming data with Kafka, NiFi, or Pulsar. Strong expertise in software design concepts, patterns (e.g., 12-Factor Apps), and tools to create CNCF-compliant software with hands-on knowledge of containerization technologies like Docker and Kubernetes. Proven ability to build and deploy software applications on one or more public cloud providers (OCI, AWS, Azure, GCP, or similar). Demonstrated ability to write full-stack applications using polyglot programming with languages/frameworks like FastAPI, Python, and Golang. Experience designing API-first systems with application stacks like FARM and MERN, and technologies such as gRPC and REST. Solid understanding of Design Thinking, Test-Driven Development (TDD), BDD, and end-to-end SDLC. Experience in DevOps practices, including Kubernetes, CI/CD, Blue-Green, and Canary deployments. Experience with Microservice architecture patterns, including API Gateways, Event-Driven & Reactive Architecture, CQRS, and SAGA. Familiarity with OOP design principles (SOLID, DRY, KISS, Common Closure, and Module Encapsulation). Proven ability to design software systems using various design patterns (Creational, Structural, and Behavioral). Strong interpersonal skills and the ability to effectively communicate with business stakeholders. Demonstrated ability to drive technology adoption in AIML Solutions and CNCF software stack. Excellent analytical, problem-solving, communication, and leadership skills. 
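The RAG variants listed in these qualifications all share the same core loop: retrieve the documents most relevant to a query, then augment the model's prompt with them. A toy sketch of the retrieval step, using bag-of-words cosine similarity in place of a real embedding model (the corpus and function names are illustrative, not any specific framework's API):

```python
import math
from collections import Counter
from typing import List

def bow(text: str) -> Counter:
    """Bag-of-words term counts; a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: List[str], k: int = 1) -> List[str]:
    """Return the top-k documents most similar to the query; the retrieved
    text would then be injected into the LLM prompt (the 'A' in RAG)."""
    q = bow(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]

# Illustrative corpus; production systems index millions of chunks.
docs = [
    "parquet is a columnar storage format",
    "kafka is a distributed event streaming platform",
    "kubernetes orchestrates containerized workloads",
]
top = retrieve("what is a columnar storage format", docs, k=1)
```

Production systems would swap `bow` for dense embeddings and the linear corpus scan for a vector index; the ranking logic keeps the same shape.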
Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
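Schema evolution, which goes hand in hand with the Parquet and Arrow serialization experience this role asks for, boils down to resolving records written under an older schema against the reader's current schema. A minimal pure-Python illustration of the Avro-style resolution rules (the field names and defaults are hypothetical):

```python
from typing import Any, Dict

def read_with_schema(record: Dict[str, Any], schema: Dict[str, Any]) -> Dict[str, Any]:
    """Resolve a record against a reader schema, Avro-style:
    writer fields unknown to the reader are dropped, and fields
    missing from the record take the schema's declared default."""
    return {field: record.get(field, default) for field, default in schema.items()}

# Reader schema added 'region' (with a default) after old records were written.
reader_schema = {"id": None, "amount": 0.0, "region": "unknown"}

old_record = {"id": 1, "amount": 9.5}                                  # pre-'region'
new_record = {"id": 2, "amount": 3.0, "region": "emea", "legacy_flag": True}

resolved_old = read_with_schema(old_record, reader_schema)   # gains the default
resolved_new = read_with_schema(new_record, reader_schema)   # drops 'legacy_flag'
```

Real Avro and Parquet readers apply these rules per-field against serialized schemas (plus type promotion rules); the sketch only shows the drop-unknown/default-missing core.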
Posted 5 days ago
2.0 - 5.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. 
refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: Associate Tower: Data, Analytics & Specialist Managed Service Experience: 2.0 - 5.5 years Key Skills: AWS Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: India. Job Description As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/risk management. Position Requirements Required Skills: AWS Cloud Engineer Job description: Candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
Should have a minimum of 1-3 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and troubleshoot data pipelines and resolve issues related to data processing, transformation, or storage. Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining Data Governance solutions (Data Quality, Metadata Management, Lineage, Master Data Management, and Data Security) using industry-leading tools. Scaling and optimizing schemas, and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
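The ETL responsibilities described above (ingest, standardize, validate, load) can be sketched in miniature. This is a hedged, stdlib-only Python illustration of the pattern; the actual role would use tools like AWS Glue/PySpark writing to Snowflake or a data lake, and the table, columns, and the data-quality rule here are invented for the example:

```python
import csv
import io
import sqlite3

# Toy "extract" source: in-memory CSV standing in for files landed in a data lake.
RAW_CSV = """order_id,amount,currency
1,100.50,usd
2,,usd
3,75.00,USD
"""

def extract(csv_text):
    """Read raw records from the source system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Standardize and validate: drop records missing an amount, normalize currency."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: reject incomplete records
        clean.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return clean

def load(rows, conn):
    """Load into the warehouse table (SQLite standing in for Snowflake/Redshift)."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 of the 3 raw records survive validation
```

The same extract/transform/load split is what the listing's tooling industrializes: Glue or DBT for the transform step, DMS for extraction, and an orchestrator such as Prefect scheduling the whole flow.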
Should have experience with ITIL processes like Incident Management, Problem Management, Knowledge Management, Release Management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative, and analytical abilities. Nice To Have AWS certification Managed Services - Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights services, focused on the evolution of our clients' Data and Analytics ecosystem.
Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
Posted 5 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Job Description Summary The Data Scientist will work in teams addressing statistical, machine learning and data understanding problems in a commercial technology and consultancy development environment. In this role, you will contribute to the development and deployment of modern machine learning, operational research, semantic analysis, and statistical methods for finding structure in large data sets. Job Description Site Overview Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is GE Aerospace’s multidisciplinary research and engineering center. Pushing the boundaries of innovation every day, engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing. Role Overview As a Data Scientist, you will be part of a data science or cross-disciplinary team on commercially-facing development projects, typically involving large, complex data sets. These teams typically include statisticians, computer scientists, software developers, engineers, product managers, and end users, working in concert with partners in GE business units. Potential application areas include remote monitoring and diagnostics across infrastructure and industrial sectors, financial portfolio risk assessment, and operations optimization. In This Role, You Will Develop analytics within well-defined projects to address customer needs and opportunities. Work alongside software developers and software engineers to translate algorithms into commercially viable products and services. Work in technical teams in development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics. Perform exploratory and targeted data analyses using descriptive statistics and other methods. 
Work with data engineers on data quality assessment, data cleansing, and data analytics. Generate reports, annotated code, and other project artifacts to document, archive, and communicate your work and outcomes. Share and discuss findings with team members. Required Qualifications Bachelor's Degree in Computer Science or “STEM” Majors (Science, Technology, Engineering and Math) with basic experience. Desired Characteristics Expertise in one or more programming languages and analytic software tools (e.g., Python, R, SAS, SPSS). Strong understanding of machine learning algorithms, statistical methods, and data processing techniques. Exceptional ability to analyze large, complex data sets and derive actionable insights. Proficiency in applying descriptive, predictive, and prescriptive analytics to solve real-world problems. Demonstrated skill in data cleansing, data quality assessment, and data transformation. Experience working with big data technologies and tools (e.g., Hadoop, Spark, SQL). Excellent communication skills, both written and verbal. Ability to convey complex technical concepts to non-technical stakeholders and collaborate effectively with cross-functional teams. Demonstrated commitment to continuous learning and staying up-to-date with the latest advancements in data science, machine learning, and related fields. Active participation in the data science community through conferences, publications, or contributions to open-source projects. Ability to thrive in a fast-paced, dynamic environment and adapt to changing priorities and requirements. Flexibility to work on diverse projects across various domains. Preferred Qualifications Awareness of feature extraction and real-time analytics methods. Understanding of analytic prototyping, scaling, and solutions integration. Ability to work with large, complex data sets and derive meaningful insights. Familiarity with machine learning techniques and their application in solving real-world problems.
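As a small, hedged illustration of the exploratory analysis with descriptive statistics mentioned above, here is a stdlib-only Python sketch; the sensor readings and the two-sigma screening rule are invented for the example (real remote-monitoring work would involve far larger data sets and more careful methods):

```python
import statistics

# Invented sensor readings standing in for a large monitoring data set.
readings = [12.1, 11.9, 12.3, 12.0, 47.5, 12.2, 11.8]

mean = statistics.mean(readings)      # pulled upward by the spike
median = statistics.median(readings)  # robust to it
stdev = statistics.stdev(readings)

# A simple two-sigma screen flags the anomalous reading for follow-up.
outliers = [x for x in readings if abs(x - mean) > 2 * stdev]
print(round(mean, 2), median, outliers)  # 17.11 12.1 [47.5]
```

The mean/median gap alone already signals a skewed distribution; in practice this kind of quick screen is only the first pass before targeted analysis.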
Strong problem-solving skills and the ability to work independently and collaboratively in a team environment. Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. Domain Knowledge Demonstrated awareness of industry and technology trends in data science Demonstrated awareness of customer and stakeholder management and business metrics Leadership Demonstrated awareness of how to function in a team setting Demonstrated awareness of critical thinking and problem-solving methods Demonstrated awareness of presentation skills Personal Attributes Demonstrated awareness of how to leverage curiosity and creativity to drive business impact Humble: respectful, receptive, agile, eager to learn Transparent: shares critical information, speaks with candor, contributes constructively Focused: quick learner, strategically prioritizes work, committed Leadership ability: strong communicator, decision-maker, collaborative Problem solver: analytical-minded, challenges existing processes, critical thinker Whether we are manufacturing components for our engines, driving innovation in fuel and noise reduction, or unlocking new opportunities to grow and deliver more productivity, our GE Aerospace teams are dedicated to making a global impact. Join us and help move the aerospace industry forward. Additional Information Relocation Assistance Provided: No
Posted 5 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Summary We are seeking an Apache Hadoop - Subject Matter Expert (SME) who will be responsible for designing, optimizing, and scaling Spark-based data processing systems. This role involves hands-on experience in Spark architecture and core functionalities, focusing on building resilient, high-performance distributed data systems. You will collaborate with engineering teams to deliver high-throughput Spark applications and solve complex data challenges in real-time processing, big data analytics, and streaming. If you’re passionate about working in fast-paced, dynamic environments and want to be part of the cutting edge of data solutions, this role is for you. We’re Looking For Someone Who Can Design and optimize distributed Spark-based applications, ensuring low-latency, high-throughput performance for big data workloads. Troubleshooting: Provide expert-level troubleshooting for any data or performance issues related to Spark jobs and clusters. Data Processing Expertise: Work extensively with large-scale data pipelines using Spark's core components (Spark SQL, DataFrames, RDDs, Datasets, and structured streaming). Performance Tuning: Conduct deep-dive performance analysis, debugging, and optimization of Spark jobs to reduce processing time and resource consumption. Cluster Management: Collaborate with DevOps and infrastructure teams to manage Spark clusters on platforms like Hadoop/YARN, Kubernetes, or cloud platforms (AWS EMR, GCP Dataproc, etc.). Real-time Data: Design and implement real-time data processing solutions using Apache Spark Streaming or Structured Streaming. What Makes You The Right Fit For This Position Expert in Apache Spark: In-depth knowledge of Spark architecture, execution models, and the components (Spark Core, Spark SQL, Spark Streaming, etc.) Data Engineering Practices: Solid understanding of ETL pipelines, data partitioning, shuffling, and serialization techniques to optimize Spark jobs. 
Big Data Ecosystem: Knowledge of related big data technologies such as Hadoop, Hive, Kafka, HDFS, and YARN. Performance Tuning and Debugging: Demonstrated ability to tune Spark jobs, optimize query execution, and troubleshoot performance bottlenecks. Experience with Cloud Platforms: Hands-on experience in running Spark clusters on cloud platforms such as AWS, Azure, or GCP. Containerization & Orchestration: Experience with containerized Spark environments using Docker and Kubernetes is a plus. Good To Have Certification in Apache Spark or related big data technologies. Experience working with Acceldata's data observability platform or similar tools for monitoring Spark jobs. Demonstrated experience with scripting languages like Bash, PowerShell, and Python. Familiarity with concepts related to application, server, and network security management. Possession of certifications from leading Cloud providers (AWS, Azure, GCP), and expertise in Kubernetes would be significant advantages.
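Much of the performance tuning and shuffling work named above comes down to how data is partitioned for the shuffle. The following is a toy, pure-Python sketch of hash partitioning (not the Spark API; CRC32 is used only to keep the demo deterministic) showing how one skewed "hot" key concentrates records in a single partition, the classic cause of straggler tasks:

```python
import zlib

def hash_partition(records, num_partitions):
    """Assign each (key, value) record to a partition by hashing its key:
    the same idea behind Spark's shuffle for groupBy/join. CRC32 keeps the
    toy deterministic; Spark uses its own partitioner."""
    partitions = [[] for _ in range(num_partitions)]
    for key, value in records:
        partitions[zlib.crc32(key.encode()) % num_partitions].append((key, value))
    return partitions

# A skewed workload: one "hot" customer dominates the data set.
records = [("hot", i) for i in range(90)] + [(f"c{i}", i) for i in range(10)]
sizes = sorted(len(p) for p in hash_partition(records, 4))
print(sizes)  # one partition holds at least the 90 "hot" rows: a straggler task
```

In real Spark jobs the usual remedies include salting the hot key, adjusting spark.sql.shuffle.partitions, or broadcasting small join sides so no shuffle is needed at all.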
Posted 5 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Role: Junior Designer Location: Mumbai, India About the role: Working in true collaboration with our client, we have one goal in mind: ‘to be the leading agency partner for the development of stunning and effective needs-based content and digital media campaigns’. These brands are at the forefront of contemporary thinking, utilising in-depth insight for digital strategy and content advertising. They are also dedicated to sustainability and foregrounding a brand purpose, which means the work we do with them requires working creative thinking into every brief, regardless of scale. We want to create industry-leading, world-class work that’s truly beautiful, smart, and effective. To help us achieve our goal, we are looking for a strong, highly motivated and conceptual designer with beauty and BPC credentials to join us in this exciting ambition. As a Graphic Designer you will be an experienced digitally focused designer, comfortable taking design projects from brief through to completion, providing new ideas and creativity whilst working closely with the client’s brand guidelines. You will be producing design to the client’s brief and exacting standards whilst positively influencing clients with your creative input, in addition to undertaking and pitching new creative concepts.
You will be expected to work on a range of creatives from e-commerce to social media, bringing simplicity in design to the most complex briefs. What you will be doing: Producing beautiful and innovative designs for our client’s websites, e-commerce (A+ and Shopalyst) pages, and social media channels. Developing a deep understanding of target audiences and the client’s marketing strategy to deliver high-quality results that have an instant, positive impact on the consumer, promoting products and brands. Working closely with the Sr. Designer to support pitching creative solutions in response to marketing strategies. Managing the preparation of all finished files so that they comply with the correct output specifications. What you need to be great in this role: Minimum 1 year of experience as a digital designer (social media) and a passion for design and conception. You'll have a deep affinity with layout, typography and idea-generation. Attention to detail with the ability to work under your own initiative. Confident in bringing your own innovative ideas and creativity to projects whilst working within a broad range of design guidelines across a variety of design collateral. Effectively organising and prioritising workloads to manage client delivery. Excellent Adobe CS skills. Understanding of design principles and an awareness of UX, UI and Responsive Design trends - ideal but not essential. Working knowledge of After Effects animation is a bonus. Retouching experience and skills to mid-level. Expertise in beauty and BPC brands. Confident and comfortable working in a fast-paced, changing client environment. A passionate and inspiring creative. The aptitude to learn new software and programmes efficiently and effectively. Self-motivated, working with little supervision. Collaborative team player, open-minded, non-political. Discreet about all confidential and personal information. Driven, proactive, helpful, and enthusiastic team player.
Passion for, and inquisitiveness about, AI and new technologies. Understanding and knowledge of AI tools is beneficial, but the ability to learn and digest the benefits and features of AI tools is critical. Req ID: 13110 Our values shape everything we do: Be Ambitious to succeed Be Imaginative to push the boundaries of what’s possible Be Inspirational to do groundbreaking work Be always learning and listening to understand Be Results-focused to exceed expectations Be actively pro-inclusive and anti-racist across our community, clients and creations OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
Currently we have an open position with our client - an IT consulting firm - for a Principal Databricks Engineer/Architect. Key Responsibilities: 1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements. 2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks. 3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks. 4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness. 5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations. 6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions. 7. Thought Leadership: Stay up-to-date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team. Requirements: 1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics. 2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake. 3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance. 4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives. 5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders. Good to Have: 1. Certifications: Databricks Certified Professional or similar certifications. 2.
Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP. 3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
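The Databricks pipeline work described above leans heavily on Delta Lake's MERGE (upsert) semantics. As a hedged, library-free illustration of what a MERGE does logically (matched rows are updated, unmatched rows are inserted), here is a toy Python sketch; the row shapes and key name are invented:

```python
def merge_upsert(target, updates, key="id"):
    """Toy upsert with MERGE-like semantics: matched rows are updated,
    unmatched rows are inserted. Lists of dicts stand in for Delta tables."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        by_key.setdefault(row[key], {}).update(row)
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
result = merge_upsert(target, updates)
print(result)  # id 2 updated to qty 7, id 3 inserted, id 1 untouched
```

On Databricks itself this is roughly a single MERGE INTO statement over Delta tables, with the platform providing the ACID guarantees the toy version lacks.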
Posted 5 days ago
0 years
0 Lacs
India
On-site
Ready to be pushed beyond what you think you’re capable of? At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system. To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems. Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be. The mission of the Platform Product Group engineers is to build a trusted, scalable and compliant platform to operate with speed, efficiency and quality. Our teams build and maintain the platforms critical to the existence of Coinbase. There are many teams that make up this group which include Product Foundations (i.e. Identity, Payment, Risk, Proofing & Regulatory, Finhub), Machine Learning, Customer Experience, and Infrastructure. As a machine learning engineer, you will play a pivotal role in constructing essential infrastructure for the open financial system. This involves harnessing diverse and extensive data sources, including the blockchain, to grant millions of individuals access to cryptocurrency while simultaneously identifying and thwarting malicious entities. Your impact extends beyond safeguarding Coinbase, as you'll have the opportunity to employ machine learning to enhance the overall user experience. 
This includes imbuing intelligence into recommendations, risk assessment, chatbots, and various other aspects, making our product not only secure but also exceptionally user-friendly. What you’ll be doing (i.e., job duties): Investigate and harness cutting-edge machine learning methodologies, including deep learning, large language models (LLMs), and graph neural networks, to address diverse challenges throughout the company. These challenges encompass areas such as fraud detection, feed ranking, recommendation systems, targeting, chatbots, and blockchain mining. Develop and deploy robust, low-maintenance applied machine learning solutions in a production environment. Create onboarding codelabs, tools, and infrastructure to democratize access to machine learning resources across Coinbase, fostering a culture of widespread ML utilization. What we look for in you (i.e., job requirements): 5+ years of industry experience as a machine learning and software engineer. Experience building backend systems at scale with a focus on data processing/machine learning/analytics. Experience with at least one ML model family: LLMs, GNNs, deep learning, logistic regression, gradient-boosted trees, etc. Working knowledge in one or more of the following: data mining, information retrieval, advanced statistics or natural language processing, computer vision. Exhibit our core cultural values: add positive energy, communicate clearly, be curious, and be a builder. Nice to haves: BS, MS, or PhD degree in Computer Science, Machine Learning, Data Mining, Statistics, or related technical field. Knowledge of Apache Airflow, Spark, Flink, Kafka/Kinesis, Snowflake, Hadoop, Hive. Experience with Python. Experience with model interpretability, responsible AI. Experience with data analysis and visualization. Job #: GPML05IN *Answers to crypto-related questions may be used to evaluate your onchain experience. Please be advised that each candidate may submit a maximum of four applications within any 30-day period.
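Of the model families listed above, logistic regression is the simplest to show end to end. The sketch below trains one with plain stochastic gradient descent on an invented toy fraud data set (stdlib only; this is an illustration of the technique, not how production fraud models at this scale are built):

```python
import math

def train_logistic(X, y, lr=0.5, epochs=500):
    """Logistic regression via plain stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # predicted prob minus label
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Invented fraud signals: [amount_zscore, new_device]; label 1 = fraudulent.
X = [[0.1, 0], [0.3, 0], [2.5, 1], [3.0, 1], [0.2, 1], [2.8, 0]]
y = [0, 0, 1, 1, 0, 1]
w, b = train_logistic(X, y)
preds = [predict(w, b, xi) for xi in X]
print(preds)
```

The learned weights show the model relying mostly on the transaction-amount signal, which is the only feature that cleanly separates this toy data.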
We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying. Commitment to Equal Opportunity Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here). Global Data Privacy Notice for Job Candidates and Applicants Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
Posted 5 days ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Cloud Architect As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance. The opportunity We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in the client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, as well as managing in-flight projects focused on cloud and big data. Need to work with clients in converting business problems/challenges to technical solutions, considering security, performance, scalability, etc. [10-15 years] Need to understand current and future state enterprise architecture. Need to contribute to various technical streams during implementation of the project.
Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success

Experience architecting highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETL services, Spark and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications. Experience in data security (in transit and at rest). Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). The ability to multi-task under pressure and work independently with minimal supervision. A team-player attitude and enjoyment of working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Responsibility for evaluating technical risks and mapping out mitigation strategies. Working knowledge of any of the cloud platforms: AWS, Azure or GCP. Excellent business communication, consulting, and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years hands-on experience in one or more of the above areas. Minimum 10 years industry experience.

Ideally, you’ll also have

Strong project management skills. Client management skills. Solutioning skills.

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
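One skill the listing above calls out, multiple Spark jobs consuming messages from multiple Kafka partitions, rests on keyed partition assignment: a producer hashes each record key to a partition, so all records for a key land in the same partition and per-key ordering is preserved for whichever consumer owns it. A minimal, dependency-free Python sketch of the idea (Kafka's default partitioner actually uses murmur2; the md5 stand-in here means the exact partition numbers are illustrative only):

```python
import hashlib

def assign_partition(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Kafka's default partitioner hashes the key bytes with murmur2;
    md5 is used here only to keep the sketch dependency-free.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All records sharing a key land in the same partition, which is what
# lets one Spark consumer per partition preserve per-key order.
events = ["user-1", "user-2", "user-1", "user-3", "user-2"]
layout = {}
for key in events:
    layout.setdefault(assign_partition(key, 4), []).append(key)
```

Because the assignment is a pure function of the key, rebalancing consumers never reorders records within a key, only reassigns whole partitions.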
Posted 5 days ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
[This listing repeats the EY GDS – Data and Analytics (D&A) – Cloud Architect description above verbatim, posted for Trivandrum; see that listing for the full role details.]
Posted 5 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
Role: Senior Data Engineer with Databricks. Experience: 5+ Years Job Type: Contract Contract Duration: 6 Months Budget: 1.0 lakh per month Location: Remote

JOB DESCRIPTION:

We are looking for a dynamic and experienced Senior Data Engineer – Databricks to design, build, and optimize robust data pipelines using the Databricks Lakehouse platform. The ideal candidate should have strong hands-on skills in Apache Spark, PySpark, and cloud data services, and a good grasp of Python and Java. This role involves close collaboration with architects, analysts, and developers to deliver scalable and high-performing data solutions across AWS, Azure, and GCP.

ESSENTIAL JOB FUNCTIONS

1. Data Pipeline Development
• Build scalable and efficient ETL/ELT workflows using Databricks and Spark for both batch and streaming data.
• Leverage Delta Lake and Unity Catalog for structured data management and governance.
• Optimize Spark jobs by tuning configurations, caching, partitioning, and serialization techniques.

2. Cloud-Based Implementation
• Develop and deploy data workflows on AWS (S3, EMR, Glue), Azure (ADLS, ADF, Synapse), and/or GCP (GCS, Dataflow, BigQuery).
• Manage and optimize data storage, access control, and pipeline orchestration using native cloud tools.
• Use tools like Databricks Auto Loader and SQL warehouses for efficient data ingestion and querying.

3. Programming & Automation
• Write clean, reusable, and production-grade code in Python and Java.
• Automate workflows using orchestration tools (e.g., Airflow, ADF, or Cloud Composer).
• Implement robust testing, logging, and monitoring mechanisms for data pipelines.

4. Collaboration & Support
• Collaborate with data analysts, data scientists, and business users to meet evolving data needs.
• Support production workflows, troubleshoot failures, and resolve performance bottlenecks.
• Document solutions, maintain version control, and follow Agile/Scrum processes.

Required Skills

Technical Skills:
• Databricks: Hands-on experience with notebooks, cluster management, Delta Lake, Unity Catalog, and job orchestration.
• Spark: Expertise in Spark transformations, joins, window functions, and performance tuning.
• Programming: Strong in PySpark and Java, with experience in data validation and error handling.
• Cloud Services: Good understanding of AWS, Azure, or GCP data services and security models.
• DevOps/Tools: Familiarity with Git, CI/CD, Docker (preferred), and data monitoring tools.

Experience:
• 5–8 years of data engineering or backend development experience.
• Minimum 1–2 years of hands-on work in Databricks with Spark.
• Exposure to large-scale data migration, processing, or analytics projects.

Certifications (nice to have): Databricks Certified Data Engineer Associate

Working Conditions
Hours of work - Full-time hours; flexibility for remote work while ensuring availability during US hours.
Overtime expectations - Overtime is not expected as long as commitments are met.
Work environment - Primarily remote; occasional on-site work may be needed only during client visits.
Travel requirements - No travel required.
On-call responsibilities - On-call duties during deployment phases.
Special conditions or requirements - Not applicable.

Workplace Policies and Agreements
Confidentiality Agreement: Required to safeguard client-sensitive data.
Non-Compete Agreement: Must be signed to ensure proprietary model security.
Non-Disclosure Agreement: Must be signed to ensure client confidentiality and security.
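The Delta Lake work above centers on MERGE INTO, which applies upsert semantics: rows that match on a key are updated, and the rest are inserted. SQLite's ON CONFLICT clause expresses the same semantics, so a stdlib-only sketch can show the shape without a Databricks cluster (the table and column names here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, city TEXT)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [(1, "Pune"), (2, "Chennai")])

# Incoming batch: id 2 changed city, id 3 is new.
updates = [(2, "Kochi"), (3, "Gurugram")]

# SQLite's ON CONFLICT upsert mirrors Delta's MERGE:
# WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT.
conn.executemany(
    """INSERT INTO dim_customer (id, city) VALUES (?, ?)
       ON CONFLICT(id) DO UPDATE SET city = excluded.city""",
    updates,
)

rows = dict(conn.execute("SELECT id, city FROM dim_customer ORDER BY id"))
```

In Databricks the same batch would be a `MERGE INTO dim_customer USING updates ON ...` statement; Delta additionally versions the table so the pre-merge state remains queryable via time travel.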
Posted 5 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description

Circle K (part of Alimentation Couche-Tard Inc. (ACT)) is a global Fortune 200 company. A leader in the convenience store and fuel space, it has a footprint across 31 countries and territories. The Circle K India Data & Analytics team is an integral part of ACT’s Global Data & Analytics Team, and the Lead Data Analyst will be a key player on this team, helping grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.

About The Role

The incumbent will be responsible for deploying analytics algorithms and tools on the chosen tech stack for efficient and effective delivery. Responsibilities include delivering insights and targeted action plans, addressing specific areas of risk and opportunity, working cross-functionally with business and technology teams, and leveraging the support of global teams for analysis and data.

Roles & Responsibilities

Analytics (Data & Insights)
Evaluate performance of categories and activities, using proven and advanced analytical methods. Support stakeholders with actionable insights based on transactional, financial or customer data on an ongoing basis. Oversee the design and measurement of experiments and pilots. Initiate and conduct advanced analytics projects such as clustering, forecasting, and causal impact analysis. Build highly impactful and intuitive dashboards that bring the underlying data to life through insights.

Operational Excellence
Improve data quality by using and improving tools to automatically detect issues. Develop analytical solutions or dashboards using user-centric design techniques in alignment with ACT’s protocol. Study industry/organization benchmarks and design/develop analytical solutions to monitor or improve business performance across retail, marketing, and other business areas.

Stakeholder Management
Work with peers, functional consultants, data engineers, and cross-functional teams to lead/support the
complete lifecycle of analytical applications, from development of mock-ups and storyboards to a complete production-ready application. Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business. Create compelling documentation or artefacts that connect the business to the solutions. Coordinate internally to share key learnings with other teams and drive accelerated business performance. Be an advocate for a data-driven culture among the stakeholders.

Job Requirements

Education
Bachelor’s degree required, preferably in an analytical discipline like Finance, Mathematics, Statistics, Engineering, or similar.

Relevant Experience
Experience: 7+ years for Lead Data Analyst. Relevant working experience in a quantitative/applied analytics role. Experience with programming, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. Spark/SQL/Python. Experience in leading projects and/or leading and mentoring small teams is a plus. Excellent communication skills in English, both verbal and written.

Behavioural Skills
Delivery excellence. Business disposition. Social intelligence. Innovation and agility.

Knowledge
Functional analytics (retail analytics, supply chain analytics, marketing analytics, customer analytics, etc.). Working understanding of statistical modelling and time series analysis using analytical tools (Python, PySpark, R, etc.). Enterprise reporting systems and relational database management systems (MySQL, Microsoft SQL Server, etc.). Business intelligence and reporting (Power BI). Cloud computing services in Azure/AWS/GCP for analytics.
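Much of the time-series analysis this role mentions begins with smoothing: a trailing moving average over, say, weekly category sales before any forecasting model is fitted. A dependency-free sketch (the window of 3 and the sales figures are invented for illustration; in PySpark the same computation would use a window function over a row range):

```python
def moving_average(series, window):
    """Trailing moving average; returns None until the window fills."""
    out = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)  # not enough history yet
        else:
            chunk = series[i + 1 - window : i + 1]
            out.append(sum(chunk) / window)
    return out

weekly_sales = [10, 12, 11, 15, 14, 13]
smoothed = moving_average(weekly_sales, 3)
```

The window size trades responsiveness for noise suppression: a larger window smooths more but lags real shifts in category performance.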
Posted 5 days ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're looking for a Product Support Specialist. This role is office-based, Pune office. GCS - Product Support Engineer (L1) | Night Shift (US EST & PST) | Pune | Office Based

The Product Support Engineer is part of the Global Customer Support Team, responsible for providing Level 1 support to clients on the Cornerstone OnDemand products. The Product Support Engineer works via telephone and electronic communication (CRM) with clients to acknowledge, analyse, and resolve complex application software related questions and troubleshoot issues encountered in applications. This position requires a hands-on individual who can passionately and patiently educate our clients on how our product is designed to work, excels at problem solving, is eager to learn, and brings a customer-centric mindset.

In this role you will
Provide day-to-day functional and technical software application support in a 24x7 environment to our clients, including functionality testing and troubleshooting as needed. Ensure proper, timely, and ongoing follow-up on assigned cases to ensure service level agreements (SLAs) are met and client satisfaction (CSAT) is high. Provide time-bound and superior customer communication over CRM (Salesforce), phone and email to prevent case staleness/ageing and prevent backlog. Follow work-on-hand prioritization when dealing with cases carrying high customer impact, and deliver time-bound resolution in line with customer expectations. Where necessary, engage next-level support in a timely manner with proper triage and case documentation. Attain and maintain product certification on Cornerstone products in adherence with Cornerstone’s Product Certification Program, achieving deep understanding of and expertise in Cornerstone applications. Actively engage with the knowledge base and forums, utilizing help channels/resources. Consistently deliver on and beyond set goals. Collaborate with team members from all around the world.
Show consideration of privacy and security obligations.

You've Got What It Takes If You Have
A bachelor’s degree in computer science or equivalent with 1-3 years of customer-facing application support experience (preferably in a SaaS environment). Hands-on experience working on and debugging issues with access management, Single Sign-On, etc. Basic awareness of SaaS, cloud computing, FTP, SSO, SMTP, HTML, etc. High organization, with an understanding of the processes, SLAs and tools used in the product support ecosystem. Superior written and verbal communication skills. A customer-centric mindset, with a passion for helping customers and providing excellent customer service. A positive attitude with the ability to think out of the box. Patience, organization, composure and good listening, thoughtfully responding to any situation. Strong analytical and problem-solving skills. A strong team-player attitude, promoting and influencing positive team spirit towards inclusive success. The role requires working in a 24x7 environment (mostly US shifts).

Our Culture
Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now – is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone – anywhere – to learn, grow and advance. To be better tomorrow than they are today.

Who We Are
Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and in nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today.
Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
Posted 5 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize value and build creative solutions.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Experience with Apache Spark (PySpark): In-depth knowledge of Spark’s architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: Familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: Strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: Expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: Knowledge of data processing libraries such as Pandas and NumPy. SQL Proficiency: Experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred Technical And Professional Experience
Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, the platform, and customer-facing systems.
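The ETL-pipeline skills listed above reduce to three composable stages: extract raw records, transform (cast types and filter), and load into a target format. A stdlib-only Python sketch of that shape (the field names and the amount filter are invented for illustration; in PySpark the same pipeline becomes a spark.read, a select/filter, and a write):

```python
import csv
import io

RAW = "id,amount\n1,100\n2,250\n3,75\n"

def extract(text):
    # Parse raw CSV text into a list of dict records.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Cast types and filter, as a Spark job would with select/filter.
    return [{"id": int(r["id"]), "amount": int(r["amount"])}
            for r in rows if int(r["amount"]) >= 100]

def load(rows):
    # Serialize the cleaned records back out as CSV.
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

result = load(transform(extract(RAW)))
```

Keeping each stage a pure function makes the pipeline unit-testable in isolation, which is the same property Spark's lazy DataFrame transformations give at cluster scale.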
Posted 5 days ago
The demand for professionals with expertise in Spark is on the rise in India. Apache Spark, an open-source distributed computing engine, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles across industries.
India's major tech hubs have a high concentration of tech companies and startups actively hiring for Spark roles.
The average salary range for Spark professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
Salaries may vary based on the company, location, and specific job requirements.
In the field of Spark, a typical career progression may look like:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.
Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:
- Hadoop
- Java or Scala programming
- Data processing and analytics
- SQL databases
Having a combination of these skills can make a candidate more competitive in the job market.
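A useful way to see these skills working together is the canonical first Spark program, a word count. In PySpark it is typically written as textFile(...).flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b); the sketch below reproduces that map/shuffle/reduce shape in dependency-free Python (the input lines are invented for illustration):

```python
from collections import Counter
from functools import reduce

lines = ["spark makes big data simple", "big data needs spark"]

# map phase: each line emits (word, 1) pairs, as flatMap + map would
pairs = [(w, 1) for line in lines for w in line.split()]

# reduce phase: merge counts per key, as reduceByKey would after the shuffle
def merge(acc, pair):
    word, n = pair
    acc[word] += n
    return acc

counts = reduce(merge, pairs, Counter())
```

The difference at cluster scale is that Spark partitions the pairs by key across executors before reducing, so each word's counts are merged locally first and only partial sums cross the network.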
As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!