2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a member of the Infosys consulting team, you will play a crucial role in addressing customer issues, identifying problem areas, devising creative solutions, and overseeing their implementation to ensure customer satisfaction. Your responsibilities will include developing proposals, contributing to solution design, configuring products, conducting pilot sessions, and resolving queries related to requirements and solution design. You will be involved in conducting solution demonstrations and workshops, and preparing effort estimates aligned with customer budgetary constraints and organizational financial guidelines. Leading small projects and participating in unit-level and organizational initiatives will be part of your role to deliver high-quality and valuable solutions to clients as they embark on their digital transformation journey.

Your skill set should include the ability to devise innovative strategies that drive client innovation, growth, and profitability, along with a good understanding of software configuration management systems and awareness of the latest technologies and industry trends. Logical thinking, problem-solving abilities, collaboration skills, and a grasp of financial processes and pricing models for projects are essential for success in this role. Additionally, you should have expertise in assessing current processes, identifying areas for improvement, and proposing technology solutions. Knowledge of one or more industry domains, client interfacing skills, and experience in project and team management will help you excel in this position at Infosys.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a ClickHouse Database Specialist, you will be responsible for helping build production-grade systems based on ClickHouse. This includes advising on how to design schemas, plan clusters, and work on infrastructure projects related to ClickHouse. You will work in diverse environments ranging from single-node setups to clusters with hundreds of nodes. Additionally, you will be involved in improving ClickHouse itself by fixing bugs, enhancing documentation, creating test cases, and studying new usage patterns, functions, and integrations with other products.

Your role will also entail the installation, configuration, backup, recovery, and maintenance of multi-node ClickHouse clusters. Monitoring and optimizing database performance to ensure high availability and responsiveness will be a key aspect of your responsibilities. Troubleshooting database issues, identifying and resolving performance bottlenecks, designing and implementing backup and recovery strategies, and developing database security policies and procedures will be part of your daily tasks. Collaborating with development teams to optimize database schema design and queries, providing technical guidance and support to development and operations teams, and handling support calls from customers using ClickHouse will be crucial components of this role.

Furthermore, experience with big data stack components like Hadoop, Spark, Kafka, and Nifi, as well as data science and data analysis, will be beneficial. Knowledge of SRE/DevOps stacks, monitoring, and system management tools such as Prometheus, Ansible, and ELK, plus version control using Git, are also desired skills for this position.

In summary, as a ClickHouse Database Specialist, you will play a vital role in ensuring the efficient operation and optimization of ClickHouse databases, contributing to the overall success of production-grade systems and infrastructure projects.
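To make the monitoring and bottleneck-hunting work concrete, here is a minimal sketch that pulls the slowest recent queries from ClickHouse's built-in system.query_log, a common first step when chasing performance issues. It assumes the clickhouse-driver Python package and a reachable node; the host is a placeholder.

```python
# A minimal sketch, assuming the clickhouse-driver package and a
# reachable ClickHouse node; the host name is illustrative only.
from clickhouse_driver import Client

client = Client(host="localhost")  # hypothetical single-node setup

# Inspect the slowest recently finished queries from the built-in
# query log, a common starting point for performance investigations.
slow_queries = client.execute(
    """
    SELECT query_duration_ms, query
    FROM system.query_log
    WHERE type = 'QueryFinish'
    ORDER BY query_duration_ms DESC
    LIMIT 10
    """
)
for duration_ms, query in slow_queries:
    print(f"{duration_ms} ms: {query[:80]}")
```

From here, a specialist would typically cross-reference the offenders against table schemas and partitioning to decide whether the fix is a schema change, an index, or query rewriting.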
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
ahmedabad, gujarat
On-site
You will be responsible for designing, developing, and maintaining scalable data pipelines using Azure Databricks. Your role will involve building and optimizing ETL/ELT processes for structured and unstructured data, collaborating with data scientists, analysts, and business stakeholders, integrating Databricks with Azure Data Lake, Synapse, Data Factory, and Blob Storage, developing real-time data streaming pipelines, and managing data models and data warehouses. Additionally, you will optimize performance, manage resources, ensure cost efficiency, implement best practices for data governance, security, and quality, troubleshoot and improve existing data workflows, contribute to architecture and technology strategy, mentor junior team members, and maintain documentation.

To excel in this role, you should have a Bachelor's/Master's degree in Computer Science, IT, or a related field, along with 5+ years of data engineering experience (minimum 2+ years with Databricks). Strong expertise in Azure cloud services (Data Lake, Synapse, Data Factory), proficiency in Spark (PySpark/Scala) and big data processing, experience with Delta Lake, Structured Streaming, and real-time pipelines, strong SQL skills, an understanding of data modeling and warehousing, familiarity with DevOps tools like CI/CD, Git, Terraform, and Azure DevOps, and excellent problem-solving and communication skills are essential. Preferred qualifications include Databricks certification (Associate/Professional), experience with machine learning workflows on Databricks, knowledge of data governance tools like Purview, experience with REST APIs, Kafka, and Event Hubs, and cloud performance tuning and cost optimization experience.

Join us to be a part of a supportive and collaborative team, work with a growing company in the exciting BI and data industry, enjoy a competitive salary and performance-based bonuses, and have opportunities for professional growth and development. If you are interested in this opportunity, please send your resume to hr@exillar.com and fill out the form at https://forms.office.com/r/HdzMNTaagw.
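As a rough illustration of the pipeline work described above, the sketch below shows one batch ETL step in PySpark, reading raw JSON from ADLS and writing a partitioned Delta table. The storage paths and column names are invented for illustration, and a Databricks runtime (where Delta support is built in) is assumed.

```python
# A minimal sketch of a batch ETL step, assuming a Databricks runtime
# with Delta Lake available; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # reuses the Databricks session

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                    # hypothetical key column
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write a Delta table partitioned by date for downstream consumers.
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```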
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
delhi
On-site
As a seasoned data engineering professional with 10+ years of experience, you will lead and mentor a team of data engineers to ensure high performance and career growth. Your primary responsibility will be to architect and optimize scalable data infrastructure, guaranteeing high availability and reliability. Additionally, you will drive the development and implementation of data governance frameworks and best practices, collaborating closely with cross-functional teams to define and execute a data roadmap. You will be responsible for ensuring data security, compliance, and quality across all data platforms while optimizing data processing workflows for performance and cost efficiency.

Your expertise in backend development using languages like Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS will be crucial. Proficiency in SQL, Python, and Scala for data processing and analytics is a must. In-depth knowledge of cloud platforms such as AWS, GCP, or Azure is required, along with hands-on experience in big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks. A strong foundation in High-Level Design (HLD) and Low-Level Design (LLD), as well as design patterns, preferably using Spring Boot or Google Guice, is necessary. Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery will be beneficial. Your role will also involve working with NoSQL databases such as Redis, Cassandra, MongoDB, and TiDB, as well as automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.

A proven ability to drive technical strategy aligned with business objectives, along with strong leadership, communication, and stakeholder management skills, is essential for this position. Candidates from Tier 1 colleges/universities with a background in product startups and experience in implementing data engineering systems from an early stage in the company are preferred. Additionally, experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture will be advantageous. Prior experience in a SaaS or high-growth tech company will be a plus.

If you are a highly skilled data engineer with a passion for innovation and technical excellence, we invite you to apply for this challenging and rewarding opportunity.
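Since the posting leans on Spark, Kafka, and real-time processing, here is a hedged sketch of a Structured Streaming job that ingests a Kafka topic and lands it in a lake path with checkpointing. It assumes the spark-sql-kafka connector package is available on the cluster; the broker, topic, and paths are placeholders.

```python
# A hedged sketch of real-time ingestion with Spark Structured
# Streaming; assumes the spark-sql-kafka connector is on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
         .option("subscribe", "events")                     # placeholder topic
         .load()
         .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Persist micro-batches with checkpointing so the job can recover
# exactly where it left off after a restart.
query = (
    events.writeStream.format("parquet")
          .option("path", "/data/lake/events")
          .option("checkpointLocation", "/data/checkpoints/events")
          .start()
)
query.awaitTermination()
```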
Posted 1 week ago
5.0 years
0 Lacs
Haryana, India
On-site
Job Description

About TaskUs: TaskUs is a provider of outsourced digital services and next-generation customer experience to fast-growing technology companies, helping its clients represent, protect and grow their brands. Leveraging a cloud-based infrastructure, TaskUs serves clients in the fastest-growing sectors, including social media, e-commerce, gaming, streaming media, food delivery, ride-sharing, HiTech, FinTech, and HealthTech. The People First culture at TaskUs has enabled the company to expand its workforce to approximately 45,000 employees globally. Presently, we have a presence in twenty-three locations across twelve countries, which include the Philippines, India, and the United States. It started with one ridiculously good idea to create a different breed of Business Processing Outsourcing (BPO)! We at TaskUs understand that achieving growth for our partners requires a culture of constant motion, exploring new technologies, being ready to handle any challenge at a moment's notice, and mastering consistency in an ever-changing world.

What We Offer: At TaskUs, we prioritize our employees' well-being by offering competitive industry salaries and comprehensive benefits packages. Our commitment to a People First culture is reflected in the various departments we have established, including Total Rewards, Wellness, HR, and Diversity. We take pride in our inclusive environment and positive impact on the community. Moreover, we actively encourage internal mobility and professional growth at all stages of an employee's career within TaskUs. Join our team today and experience firsthand our dedication to supporting People First.

Job Description Summary: We are seeking a Data Scientist with deep expertise in modern AI/ML technologies to join our innovative team. This role combines cutting-edge research in machine learning, deep learning, and generative AI with practical full-stack cloud development skills. You will be responsible for architecting and implementing end-to-end AI solutions, from data engineering pipelines to production-ready applications leveraging the latest in agentic AI and large language models.

Key Responsibilities

AI/ML Development & Research
- Design, develop, and deploy advanced machine learning and deep learning models for complex business problems
- Implement and optimize Large Language Models (LLMs) and Generative AI solutions
- Build agentic AI systems with autonomous decision-making capabilities
- Conduct research on emerging AI technologies and their practical applications
- Perform model evaluation, validation, and continuous improvement

Cloud Infrastructure & Full-Stack Development
- Architect and implement scalable cloud-native ML/AI solutions on AWS, Azure, or GCP
- Develop full-stack applications integrating AI models with modern web technologies
- Build and maintain ML pipelines using cloud services (SageMaker, ML Engine, etc.)
- Implement CI/CD pipelines for ML model deployment and monitoring
- Design and optimize cloud infrastructure for high-performance computing workloads

Data Engineering & Database Management
- Design and implement data pipelines for large-scale data processing
- Work with both SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Optimize database performance for ML workloads and real-time applications
- Implement data governance and quality assurance frameworks
- Handle streaming data processing and real-time analytics

Leadership & Collaboration
- Mentor junior data scientists and guide technical decision-making
- Collaborate with cross-functional teams including product, engineering, and business stakeholders
- Present findings and recommendations to technical and non-technical audiences
- Lead proof-of-concept projects and innovation initiatives

Required Qualifications

Education & Experience
- Master's or PhD in Computer Science, Data Science, Statistics, Mathematics, or related field
- 5+ years of hands-on experience in data science and machine learning
- 3+ years of experience with deep learning frameworks and neural networks
- 2+ years of experience with cloud platforms and full-stack development

Technical Skills - Core AI/ML
- Machine Learning: Scikit-learn, XGBoost, LightGBM, advanced ML algorithms
- Deep Learning: TensorFlow, PyTorch, Keras, CNN, RNN, LSTM, Transformers
- Large Language Models: GPT, BERT, T5, fine-tuning, prompt engineering
- Generative AI: Stable Diffusion, DALL-E, text-to-image, text generation
- Agentic AI: Multi-agent systems, reinforcement learning, autonomous agents

Technical Skills - Development & Infrastructure
- Programming: Python (expert), R, Java/Scala, JavaScript/TypeScript
- Cloud Platforms: AWS (SageMaker, EC2, S3, Lambda), Azure ML, or Google Cloud AI
- Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, Cassandra, DynamoDB)
- Full-Stack Development: React/Vue.js, Node.js, FastAPI, Flask, Docker, Kubernetes
- MLOps: MLflow, Kubeflow, model versioning, A/B testing frameworks
- Big Data: Spark, Hadoop, Kafka, streaming data processing

Preferred Qualifications
- Experience with vector databases and embeddings (Pinecone, Weaviate, Chroma)
- Knowledge of LangChain, LlamaIndex, or similar LLM frameworks
- Experience with model compression and edge deployment
- Familiarity with distributed computing and parallel processing
- Experience with computer vision and NLP applications
- Knowledge of federated learning and privacy-preserving ML
- Experience with quantum machine learning
- Expertise in MLOps and production ML system design

Key Competencies

Technical Excellence
- Strong mathematical foundation in statistics, linear algebra, and optimization
- Ability to implement algorithms from research papers
- Experience with model interpretability and explainable AI
- Knowledge of ethical AI and bias detection/mitigation

Problem-Solving & Innovation
- Strong analytical and critical thinking skills
- Ability to translate business requirements into technical solutions
- Creative approach to solving complex, ambiguous problems
- Experience with rapid prototyping and experimentation

Communication & Leadership
- Excellent written and verbal communication skills
- Ability to explain complex technical concepts to diverse audiences
- Strong project management and organizational skills
- Experience mentoring and leading technical teams

How We Partner To Protect You: TaskUs will neither solicit money from you during your application process nor require any form of payment in order to proceed with your application. Kindly ensure that you are always in communication with only authorized recruiters of TaskUs.

DEI: In TaskUs we believe that innovation and higher performance are brought by people from all walks of life. We welcome applicants of different backgrounds, demographics, and circumstances. Inclusive and equitable practices are our responsibility as a business.
TaskUs is committed to providing equal access to opportunities. If you need reasonable accommodations in any part of the hiring process, please let us know. We invite you to explore all TaskUs career opportunities and apply through the provided URL https://www.taskus.com/careers/ . TaskUs is proud to be an equal opportunity workplace and is an affirmative action employer. We celebrate and support diversity; we are committed to creating an inclusive environment for all employees. TaskUs people first culture thrives on it for the benefit of our employees, our clients, our services, and our community. Req Id: R_2507_10290_0 Posted At: Thu Jul 31 2025 00:00:00 GMT+0000 (Coordinated Universal Time)
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
About Marriott: Marriott Tech Accelerator is part of Marriott International, a global leader in hospitality. Marriott International, Inc. is a leading American multinational company that operates a vast array of lodging brands, including hotels and residential properties. It consists of over 30 well-known brands and nearly 8,900 properties situated in 141 countries and territories.

Role Title: Security Data Scientist

Position Summary: Marriott International's Global Information Security is seeking an experienced Security Data Scientist who can combine expertise in cybersecurity with data science skills to analyze and protect Marriott's digital assets.

Job Responsibilities:
- Perform data cleaning, analysis, and modeling tasks.
- Work under the guidance of senior team members to:
  - Analyze large datasets related to cybersecurity threats and incidents.
  - Implement existing machine learning models and algorithms to detect anomalies and potential security breaches (see the sketch after this posting).
  - Support SDL tools (e.g., big data, ML/AI technologies).
- Create data visualizations and reports to communicate insights to stakeholders.
- Collaborate with cybersecurity teams to implement data-driven security solutions.
- Stay up to date with the latest cyber threats and data science techniques.
- Help to maintain and document SDL MLOps processes and procedures.

Skills and Experience:
- 2-4 years of data science, data analytics, data management, and/or information security experience, including:
  - 2+ years of experience in data science/data analytics in an enterprise environment.
  - 1+ years of experience in information protection/information security.
- Strong background in statistics, mathematics, and software engineering (e.g., proficiency in Python, R).
- Experience with machine learning algorithms and frameworks as well as AI techniques.
- Knowledge of cybersecurity principles, tools, and best practices.
- Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies.
- Understanding of data visualization tools like Power BI.

Preferred:
- Programming languages: Python, R, SQL.
- Machine learning frameworks: TensorFlow, PyTorch, scikit-learn.
- Big data technologies: Hadoop, Spark, and Kafka.
- Cloud platforms: AWS, Azure, GCP.
- Data visualization tools: Tableau, Power BI.
- Relevant certifications such as data science certifications, CISSP, CEH.
- Verbal and written communication skills.

Education and Certifications: Bachelor's degree in computer/data science, information management, cybersecurity, or a related field, or equivalent experience/certification.

Work location: Hyderabad, India. Work mode: Hybrid.
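As a small illustration of the anomaly-detection responsibility above, here is a hedged sketch using scikit-learn's IsolationForest over invented login telemetry. The features, thresholds, and data are purely illustrative, not Marriott's actual SDL tooling.

```python
# A minimal sketch of anomaly detection over invented security
# telemetry; feature columns and values are illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Columns: failed_logins_per_hour, bytes_out_mb, distinct_dest_ips
normal = rng.normal(loc=[2, 50, 5], scale=[1, 10, 2], size=(500, 3))
suspicious = np.array([[40, 900, 60], [25, 700, 45]])  # planted outliers
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)  # -1 marks likely anomalies

print(f"flagged {int((flags == -1).sum())} of {len(X)} events")
```

In practice, flagged events would feed a triage queue for security analysts rather than trigger automated action directly.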
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The role of warehousing and logistics systems is becoming increasingly crucial in enhancing the competitiveness of various companies and contributing to the overall efficiency of the global economy. Modern intra-logistics solutions integrate cutting-edge mechatronics, sophisticated software, advanced robotics, computational perception, and AI algorithms to ensure high throughput and streamlined processing for critical commercial logistics functions.

Our Warehouse Execution Software is designed to optimize intralogistics and warehouse automation by utilizing advanced optimization techniques. By synchronizing discrete logistics processes, we have created a real-time decision engine that maximizes labor and equipment efficiency. Our software empowers customers with the operational agility essential for meeting the demands of an Omni-channel environment.

We are seeking a dynamic individual who can develop state-of-the-art MLOps and DevOps frameworks for AI model deployment. The ideal candidate should possess expertise in cloud technologies, deployment architectures, and software production standards. Moreover, effective collaboration within interdisciplinary teams is key to successfully guiding products through the development cycle.

**Core Job Responsibilities:**
- Develop comprehensive pipelines covering the ML lifecycle from data ingestion to model evaluation (see the tracking sketch after this posting).
- Collaborate with AI scientists to expedite the operationalization of ML algorithms.
- Establish CI/CD/CT pipelines for ML algorithms.
- Implement model deployment both in cloud and on-premises edge environments.
- Lead a team of DevOps/MLOps engineers.
- Stay updated on new tools, technologies, and industry best practices.

**Key Qualifications:**
- Master's degree in Computer Science, Software Engineering, or a related field.
- Proficiency in cloud platforms, particularly GCP, and relevant skills like Docker, Kubernetes, and edge computing.
- Familiarity with task orchestration tools such as MLflow, Kubeflow, Airflow, Vertex AI, and Azure ML.
- Strong programming skills, preferably in Python.
- Robust DevOps expertise including Linux/Unix, testing, automation, Git, and build tools.
- Knowledge of data engineering tools like Beam, Spark, Pandas, SQL, and GCP Dataflow is advantageous.
- Minimum 5 years of experience in relevant fields, including academic exposure.
- At least 3 years of experience in managing a DevOps/MLOps team.
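A hedged sketch of the experiment-tracking slice of such an ML lifecycle pipeline, using MLflow (one of the orchestration tools named above). The experiment name, model, and metric are invented for illustration; a real pipeline would wrap this in CI/CD/CT stages.

```python
# A minimal sketch of model tracking with MLflow; experiment and
# metric names are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("warehouse-demand-model")  # hypothetical name
with mlflow.start_run():
    model = LogisticRegression(max_iter=500).fit(X_tr, y_tr)
    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("test_accuracy", model.score(X_te, y_te))
    # The logged artifact can then be promoted through deployment
    # stages by the CI/CD/CT pipeline.
    mlflow.sklearn.log_model(model, "model")
```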
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Architect, you will be responsible for leading, analyzing, designing, and delivering analytics solutions and applications, including statistical data models, reports, and dashboards, in cloud environments such as AWS, Azure, and GCP, as well as the corresponding cloud-based EDW database platforms like Snowflake, Redshift, and BigQuery. You must have a minimum of 8 years of experience, with at least 3 years in the role of a data architect for Data Warehouse and Analytics solutions.

Your role will draw on your 3+ years of experience with cloud platforms (AWS, Azure, GCP) and a strong understanding of the ingestion and consumption processes in Data Lakes. You should also have 3+ years of experience with cloud-based EDW platforms such as Snowflake, Redshift, BigQuery, or Synapse, and be adept at building and launching new data models that provide intuitive analytics for analysts and customers.

In this position, you will be expected to work with and analyze large datasets within the relevant domains of enterprise data, and demonstrate strong experience in Data Warehouse ETL design and development, methodologies, tools, processes, and best practices. Proficiency in writing complex SQL, PL/SQL, and UNIX scripts, together with an understanding of performance tuning and troubleshooting, is also crucial in this role.

Furthermore, you should possess good communication and presentation skills, with a proven track record of using insights to influence executives and colleagues. Awareness of or expertise in data security, data access controls, DevOps tools, and development frameworks like SCRUM/Agile will be beneficial. Your responsibilities will also include recommending solutions to improve cloud and existing data warehouse solutions, as well as showcasing new advanced analytics capabilities to business and technology teams to demonstrate the potential of the data platform. Overall, your leadership abilities will be essential in driving cross-functional development of new solutions from design through delivery. (ref: hirist.tech)
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This is a data engineer position - a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is defining optimal solutions to data collection, processing, and warehousing. The candidate must have Spark and Java development expertise in big data processing, along with Python and Apache Spark, particularly within the banking and finance domain. You will design, code, and test data systems and work on implementing them into the internal infrastructure.

Responsibilities:
- Ensure high-quality software development, with complete documentation and traceability
- Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data
- Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance
- Ensure efficient data storage and retrieval using Big Data
- Implement best practices for Spark performance tuning, including partitioning, caching, and memory management (see the PySpark sketch after this posting)
- Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins)
- Work on batch processing frameworks for market risk analytics
- Promote unit/functional testing and code inspection processes
- Work with business stakeholders and Business Analysts to understand the requirements
- Work with other data scientists to understand and interpret complex datasets

Qualifications:
- 5-8 years of experience working in data ecosystems.
- 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks.
- 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase.
- Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL.
- Data integration, migration, and large-scale ETL experience (common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design and build, handling, reconciliation, and normalization.
- Data modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning).
- Experienced in working with large and multiple datasets and data warehouses.
- Experience building and optimizing 'big data' data pipelines, architectures, and datasets.
- Strong analytic skills and experience working with unstructured datasets.
- Ability to effectively use complex analytical, interpretive, and problem-solving techniques.
- Experience with Confluent Kafka, Red Hat jBPM, and CI/CD build pipelines and toolchain: Git, Bitbucket, Jira.
- Experience with external cloud platforms such as OpenShift, AWS, and GCP.
- Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos).
- Experienced in integrating search solutions with middleware and distributed messaging (Kafka).
- Highly effective interpersonal and communication skills with tech/non-tech stakeholders.
- Experienced in the software development life cycle, with good problem-solving skills.
- Excellent problem-solving skills and a strong mathematical and analytical mindset.
- Ability to work in a fast-paced financial environment.

Education:
Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Data Architecture
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

View Citi's EEO Policy Statement and the Know Your Rights poster.
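The posting centers on Spark Java; for brevity, the sketch below illustrates the same partitioning and caching ideas through the PySpark API, where the concepts carry over directly. Paths, column names, and the partition count are invented.

```python
# A minimal illustration of Spark partitioning and caching; data
# volumes, paths, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("risk-batch").getOrCreate()

trades = spark.read.parquet("/data/trades")  # placeholder path

# Repartition by the grouping key so the shuffle happens once, then
# cache the reused intermediate instead of recomputing it per query.
by_book = trades.repartition(200, "book_id").cache()

exposure = by_book.groupBy("book_id").agg(F.sum("notional").alias("exposure"))
counts = by_book.groupBy("book_id").count()

exposure.show()
counts.show()
by_book.unpersist()  # release executor memory when done
```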
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark
Good to have skills: Python (Programming Language), Apache Spark, Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Good To Have Skills: Experience with Apache Spark, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing frameworks and their applications.
- Experience in developing scalable applications using distributed computing (see the sketch after this posting).
- Familiarity with cloud platforms and their integration with application development.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Chennai office.
- A 15 years full time education is required.
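For readers new to PySpark, a minimal, self-contained sketch of the kind of distributed transformation this role involves; the inline data and column names are invented.

```python
# A minimal PySpark sketch; the inline data is illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("demo").getOrCreate()

df = spark.createDataFrame(
    [("mumbai", 120.0), ("chennai", 80.0), ("mumbai", 45.5)],
    ["city", "amount"],
)

# Aggregations like this are executed in parallel across executors.
totals = df.groupBy("city").agg(F.sum("amount").alias("total_amount"))
totals.show()
```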
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark
Good to have skills: Apache Spark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Good To Have Skills: Experience with Apache Spark.
- Strong understanding of data processing and transformation techniques.
- Familiarity with application development frameworks and methodologies.
- Experience in debugging and troubleshooting application issues.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 1 week ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Apache Spark
Good to have skills: Java, Scala, PySpark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing requirements, proposing solutions, and ensuring that the data platform aligns with organizational goals and standards. Your role will require you to stay updated with industry trends and best practices to contribute effectively to the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay abreast of emerging technologies and methodologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Good To Have Skills: Experience with Java, Scala, PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration tools and techniques.
- Familiarity with cloud platforms and services related to data engineering.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Kolkata office.
- A 15 years full time education is required.
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a sales professional for our client, who works in the EV battery segment.

Responsibilities:
- Lead Generation & Prospecting: Identify and qualify leads through cold outreach, networking, and market intelligence. Be the spark that starts our customer journey.
- Relationship Building: Develop deep connections with decision-makers in fleet, logistics, and mobility companies. Understand their pain points and become their trusted vehicle and energy advisor.
- Consultative Selling: Pitch our cutting-edge energy stack with clarity and impact. Tailor solutions that fit customer needs and demonstrate ROI.
- Deal Closure: Negotiate commercial terms, handle objections, and close high-value deals, all while delivering value at every step.
- Customer Success & Growth: Nurture existing accounts, unlock upsell opportunities, and build long-term relationships.
- Market Insights: Stay ahead of trends in EVs, energy, and logistics; be the go-to person for competitive intelligence.
- Reporting & Analysis: Use data to improve, from pipeline health to deal velocity, and share insights that drive better outcomes.

Qualifications:
- Bachelor's degree in a relevant field, such as engineering or business; an MBA is a plus.
- At least 2 years of proven experience in B2B sales, ideally within the commercial vehicles, last-mile logistics, or energy sector.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Self-motivated and results-oriented with a strong work ethic.
- Understanding of logistics and electric vehicles (advantageous).
- Proficiency in Excel and PowerPoint.

What matters:
- Quality of work
- Approach towards problem-solving
- Dissatisfaction with mediocre work
- A resilient attitude to bounce back after failing
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

Overview

As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing's team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company's core values of safety, quality and integrity.

Technology for today and tomorrow

The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture

At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing. With us, you can create and contribute to what matters most in your career, community, country, and world. Join us in powering the progress of global aerospace.

The Boeing India IT Product Systems team is currently looking for an Associate Software Developer - Java Full Stack to join the team in Bangalore, India. This role will be based out of Bangalore, India.

Position Responsibilities:
- Understands and develops software solutions to meet end user requirements.
- Ensures that the application integrates with the overall system architecture, utilizing standard IT lifecycle methodologies and tools.
- Develops algorithms, data and process models, plans interfaces, and writes interface control documents for use in constructing solutions of moderate complexity.

Employer will not sponsor applicants for employment visa status.

Basic Qualifications (Required Skills/Experience):
- 2+ years of relevant experience in the IT industry.
- Experience in designing and implementing idiomatic RESTful APIs using the Spring framework (v6.0+) with Spring Boot (v3.0+) and Spring Security (v6.0+) in Java (v17+). Experience with additional languages (Scala/Kotlin/others) preferred.
- Working experience with RDBMSs, basic SQL scripting and querying, specifically with SQL Server (2018+) and Teradata (v17+). Additional knowledge of schema/modelling/querying optimization preferred.
- Experience with TypeScript (v5+), JavaScript (ES6+), Angular (v15+), Material UI, and AmCharts (v5+).
- Experience working with ALM tools (Git, Gradle, SonarQube, Coverity, Docker, Kubernetes) driven by tests (JUnit, Mockito, Hamcrest, etc.).
- Experience in shell scripting (Bash/Sh), CI/CD processes and tools (GitLab CI/similar), and OCI containers (Docker/Podman/Buildah, etc.).
- Data analysis and engineering experience with Apache Spark (v3+) in Scala, Apache Iceberg/Parquet, etc. Experience with Trino/Presto is a bonus.
- Familiarity with GCP/Azure (VMs, container runtimes, BLOB storage solutions) preferred but not mandatory.

Preferred Qualifications (Desired Skills/Experience):
- A Bachelor's degree or higher is preferred.
- Strong backend experience (Java/Scala/Kotlin, etc.) with basic data analysis/engineering experience (Spark/Parquet, etc.); OR basic backend experience (Java/Scala, etc.) with strong data analysis/engineering experience (Spark/Parquet, etc.); OR moderate backend experience (Java/Kotlin, etc.) with strong frontend experience (Angular 15+ with SASS/Angular Material) and exposure to DevOps pipelines (GitLab CI).

Typical Education & Experience:
Bachelor's Degree with typically 2 to 5 years of experience OR Master's Degree with typically 1 to 2 years of experience is preferred but not required.

Relocation: This position offers relocation within India, based on candidate eligibility.

Applications for this position will be accepted until Aug. 09, 2025.

Export Control Requirements: This is not an Export Control position.

Visa Sponsorship: Employer will not sponsor applicants for employment visa status.

Shift: Not a Shift Worker (India)

Equal Opportunity Employer:

We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds including but not limited to race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law.

We have teams in more than 65 countries, and each person plays a role in helping us become one of the world's most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers, and offering flexible interview formats such as virtual or phone interviews.
Posted 1 week ago
3.0 - 20.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Change Management and Transformation Consultant - Capital Markets

Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech and leading companies across industries.

Practice: Capital Markets, Industry Consulting, Capability Network | Areas of Work: Change Management and Transformation | Level: 11/9/7/6/5 | Location: Bengaluru/Gurugram/Mumbai | Years of Exp: 3-20 years

Explore an Exciting Career at Accenture

Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture Strategy and Consulting is the right place for you to explore limitless possibilities.

The Practice - A Brief Sketch

As part of the Capital Markets practice within Accenture's Capability Network, you will work with our global teams to help investment banks, asset and wealth managers, and exchanges prepare for the digital future. Together, let's leverage global strategies and data-driven insights to pave the way for digital-enabled capital markets. Help us unlock new value in a disruptive world, with the following initiatives:

- Collaborate with clients to solve complex problems such as regulatory reforms and their implementation.
- Define and manage organizational change with reference to process, technology, and organization structure.
- Manage transformation projects to migrate from legacy to target state.
- Assess as-is processes and suggest industry best practices to arrive at to-be processes, then implement them to remove inefficiencies.
- Support data governance and management, and help optimize operations and drive business decision-making.
- Support the development of collateral, methodology refinements, best-practice updates, and trend tracking; create and support proposals incorporating the Accenture value proposition.
- Incorporate Accenture best practices and help develop methodologies at every stage of the project management lifecycle.

Bring your best skills forward to excel in the role:
- Good analytical and problem-solving skills
- Excellent communication, interpersonal and presentation skills
- Cross-cultural competence with an ability to thrive in a dynamic consulting environment
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
- 6-10 years of relevant experience in an Apps Development or systems analysis role
- Extensive experience in system analysis and in programming of software applications
- Experience in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Consistently demonstrates clear and concise written and verbal communication

Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Role Summary:

RHOO (RegHub On Olympus) is a Regulatory Reporting framework built on the Olympus tech stack to centrally report all in-scope transactions, events, and client reports on a single, scalable, cost-effective regulatory architecture that mitigates regulatory and reputational risk through delivery of complete, accurate, and compliant reporting for various businesses. We are looking for a Big Data and AI Lead responsible for driving our organization's efforts in leveraging big data and artificial intelligence (AI) to achieve regulatory objectives. This role involves developing and implementing strategies for data collection, storage, processing, analysis, and AI-driven applications. By embracing AI/ML, RegHub On Olympus (RHOO) will remain at the forefront of regulatory reporting technology and provide a competitive advantage in the financial industry.

Required Skillset:
- 10+ years of hands-on experience in Java development
- Working experience in designing systems with low-latency streaming architectures: Spark, Spring Boot, Kafka (ELK), Flink, or any real-time streaming framework (see the consumer sketch after this posting)
- Working knowledge of NoSQL, Kafka, Big Data, MongoDB
- Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala
- Experience in delivering regulatory requirements in extremely compressed timelines
- Experience in JMS and real-time message processing on TIBCO
- Experience using Jira and Bitbucket, and managing development/testing/release efforts
- Experience using XML, FIX, POJO, and JSON message formats
- Experience with Impala, Elastic, and Oracle
- Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring
- Hands-on experience with Amazon Simple Storage Service (S3)
- Added advantage: hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules
- Good to have: knowledge of orders, executions, and trade life cycle events, and experience working with message formats such as FpML and ISO 20022
- Ability to lead a team from the front and guide them through time-sensitive milestones
- Focus on leveraging regulatory delivery as a driver for the firm's data strategy
- Very good communication and interpersonal skills
- Excellent relationships with senior tech, business, and compliance partners
- Experience with delivery expertise for large-scale programs
- Ability to align delivery with the firm's long-term strategy
- Quick decision maker with the ability to grasp the situation in case of an issue and mitigate the impact

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

View Citi's EEO Policy Statement and the Know Your Rights poster.
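A hedged sketch of real-time message handling with the kafka-python client, filtering trade life-cycle events of the sort the posting mentions. The topic, brokers, and message fields are placeholders, and production code would add schema validation, error handling, and offset management appropriate to the reporting SLAs.

```python
# A minimal sketch using the kafka-python client; topic, brokers, and
# message fields are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trade-events",                        # hypothetical topic
    bootstrap_servers=["broker:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="reg-reporting",
)

for message in consumer:
    event = message.value
    # Route only reportable life-cycle events downstream.
    if event.get("event_type") in {"NEW", "AMEND", "CANCEL"}:
        print(f"report trade {event.get('trade_id')} ({event['event_type']})")
```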
Posted 1 week ago
13.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
- 13+ years of relevant experience in an Apps Development or systems analysis role
- Extensive experience in system analysis and in programming of software applications
- Experience in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Consistently demonstrates clear and concise written and verbal communication
- Working experience in designing systems with low-latency streaming architectures
- Working knowledge of NoSQL, Kafka, Big Data, MongoDB
- Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala
- Experience in delivering regulatory requirements in extremely compressed timelines
- Experience in JMS and real-time message processing on TIBCO
- Experience using Jira and Bitbucket, and managing development/testing/release efforts
- Experience using XML, FIX, POJO, and JSON message formats
- Experience with Impala, Elastic, and Oracle
- Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring
- Hands-on experience with Amazon Simple Storage Service (S3)
- Added advantage: hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules
- Good to have: knowledge of orders, executions, and trade life cycle events, and experience working with message formats such as FpML and ISO 20022
- Ability to lead a team from the front and guide them through time-sensitive milestones
- Focus on leveraging regulatory delivery as a driver for the firm's data strategy

Education:
Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Analytics Lead Analyst is a strategic professional who stays abreast of developments within their own field and contributes to directional strategy by considering their application in their own job and the business. Recognized technical authority for an area within the business. Requires basic commercial awareness. There are typically multiple people within the business who provide the same level of subject matter expertise. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Significant impact on the area through complex deliverables. Provides advice and counsel related to the technology or operations of the business. Work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family.

Responsibilities:
- Deep hands-on experience with PySpark for data processing, ETL (Extract, Transform, Load) operations, data manipulation, and building distributed computing solutions on large datasets (a data-quality sketch follows this posting).
- Proficiency in designing and building robust data pipelines and data ingestion, transformation, and processing workflows.
- Solid understanding of data modeling principles and database design, and strong SQL skills for data querying and analysis.
- Ability to analyze data, identify patterns, uncover insights, and translate business needs into actionable data solutions.
- Leading and mentoring a team of data engineers or analysts, fostering best practices, and ensuring the delivery of high-quality data products.
- Working closely with product partners and business analysts to understand requirements and deliver impactful analytical solutions.

Qualifications:

To be successful in this role, you should meet the following requirements:
- 8+ years of experience in handling distributed/big data projects.
- Proficiency in PySpark, Linux scripting, SQL, and big data tools.
- Technology stack: PySpark, ETL, Unix shell scripting, Python, Spark, SQL, Impala, Hive.
- Strong exposure to interpreting business requirements from a technical perspective.
- Ability to design, develop, and implement IT solutions that fulfill business users' requirements and conform to a high quality standard.
- Sound problem-solving skills and attention to detail.
- Strong communication, presentation, and team collaboration skills.
- Knowledge of automation and DevOps practices.
- Familiarity with agile development methodologies using Jira.

Education:
Bachelor's/University degree or equivalent experience, potentially a Master's degree

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Data Analytics
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
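To make the PySpark requirement in the listing above concrete, here is a minimal sketch of the kind of ETL pipeline such a role involves. This is an illustrative assumption, not the employer's actual codebase; the paths, table layout, and column names (txn_id, txn_ts, amount, account_id) are hypothetical.

```python
# Minimal PySpark ETL sketch: ingest raw CSV transactions, clean and
# aggregate them, and write a partitioned Parquet table.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn-etl-sketch").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("/data/raw/transactions/*.csv"))  # hypothetical input path

cleaned = (raw
           .dropDuplicates(["txn_id"])                   # de-duplicate on a business key
           .withColumn("txn_date", F.to_date("txn_ts"))  # derive a date column
           .filter(F.col("amount").isNotNull()))         # drop incomplete records

daily_totals = (cleaned
                .groupBy("txn_date", "account_id")
                .agg(F.sum("amount").alias("total_amount"),
                     F.count("*").alias("txn_count")))

(daily_totals.write
 .mode("overwrite")
 .partitionBy("txn_date")
 .parquet("/data/curated/daily_totals"))  # hypothetical output path
```

On a real engagement the paths would point at HDFS or cloud storage and the schema would be declared explicitly rather than inferred, but the shape of the job (ingest, cleanse, aggregate, persist) is the same.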
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
A.P. Moller – Maersk
A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we have an elevated level of responsibility to continue building an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers’ supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners.
The Brief
As a Senior AI/ML Engineer in our Data & AI Governance team, you’ll build the systems that improve how Maersk detects, manages, and fixes data quality issues at scale, while contributing to responsible AI observability and compliance tooling. This is a hands-on engineering role focused on platform-level tooling for data reliability, model traceability, and metadata intelligence. You’ll work across structured and unstructured data, help enforce quality SLAs, and contribute to components that support the governance of AI/ML models. The role sits at the intersection of platform engineering, data operations, and applied AI - ideal for someone who enjoys building reusable tools, mentoring others, and making complex systems more reliable and auditable. This is a key part of our long-term vision to treat data quality with the same urgency and rigor as platform reliability. The systems you build will help set a new standard for how we manage quality, fairness, and trust in enterprise data and AI.
Senior AI/ML Engineer
Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
What I'll be doing – your accountabilities
- Build and scale AI/ML-driven components to detect data anomalies, schema drift, and degradation in real time across pipelines (a minimal anomaly-detection sketch follows this listing)
- Develop validation logic, auto-profiling tools, and scoring engines to assess and monitor enterprise data quality
- Design architecture for AI/ML-based DQ solutions that are modular, reusable, and scalable
- Apply AI/ML techniques including NLP, rule induction, and pattern classification to enrich metadata and detect systemic quality issues
- Build tooling to support responsible AI: drift tracking, fairness detection, explainability indicators, and lifecycle logging
- Partner with platform engineers to integrate these tools into orchestration systems (e.g., Airflow, MLflow, or Dagster)
- Work with data owners and stewards to operationalize quality ownership using MIDAS – Maersk’s enterprise AI platform for metadata inventory, data accountability, and governance
- Contribute to the creation of a DataOps playbook with SLAs, page-zero metrics, escalation routines, and ownership models
- Mentor junior engineers and shape architectural and engineering best practices for AI/ML observability and data quality tooling
Foundational Skills
- Expert-level Python engineering experience with a proven ability to ship AI/ML-backed tooling at production scale
- Advanced knowledge of data pipelines and orchestration frameworks (e.g., Airflow, Spark, Dagster)
- Expert understanding of system observability - logging, telemetry, health scoring - applied to data and model workflows
- Proven track record of applying advanced AI/ML techniques (e.g., classification, clustering, anomaly detection) in production settings
- Strong grounding in solution architecture for data-intensive, distributed systems
Specialized Skills
- Deep experience applying AI/ML to data quality use cases such as profiling, anomaly detection, drift analysis, and schema inference
- Expertise in metadata management, lineage tracing, and automated documentation (e.g., via DataHub, Unity Catalog, or Collibra)
- Hands-on experience with responsible AI tooling (e.g., SHAP, LIME, Fairlearn, What-If Tool) for explainability and bias detection
- Built or contributed to platform-level components that are used across domains, not just in isolated project delivery
- Ability to design and implement architectural patterns that support federated ownership, reuse, and lifecycle transparency
- Eagerness to learn and contribute to AI governance frameworks (e.g., EU AI Act, ISO 42001, NIST AI RMF) and translate those into engineering patterns
Qualifications & Requirements
- 8+ years of engineering experience, including at least 3 years building and deploying AI/ML solutions in production
- Demonstrated experience building DQ and model observability tools - not just core predictive systems
- Strong experience working in cross-functional platform teams that deliver shared services used across business units
- Fluent in MLOps tooling (e.g., MLflow, SageMaker, Vertex AI) and capable of versioning, tracking, and documenting model behavior
- Strong communication and documentation skills; able to make complex system behavior understandable and operable
- Passion for enabling trustworthy AI through high-quality engineering practices
Preferred Experiences
In addition to basic qualifications, it would be great if you have:
- Experience implementing data quality scoring, monitoring, or root cause tooling in a production environment
- Experience working with shared metadata systems and operationalizing lineage or traceability at scale
- Strong involvement in platform teams or developer enablement functions - not just analytics or research delivery
- Applied experience with model explainability, fairness evaluation, or lifecycle documentation tooling
- Understanding of enterprise AI risk and how to translate policy into engineering design constraints
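As a hedged illustration of the anomaly-detection accountability above, here is a minimal sketch that flags drifting columns by comparing simple profile statistics against a baseline. The thresholds, file names, and pandas-based approach are illustrative assumptions, not Maersk's actual tooling.

```python
# Minimal data-quality drift check: compare per-column null rate and mean
# against a stored baseline and flag columns whose drift exceeds a threshold.
# Column names, thresholds, and the baseline format are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Compute simple per-column statistics for numeric columns."""
    stats = []
    for col in df.select_dtypes("number").columns:
        stats.append({
            "column": col,
            "null_rate": df[col].isna().mean(),
            "mean": df[col].mean(),
            "std": df[col].std(),
        })
    return pd.DataFrame(stats).set_index("column")

def detect_drift(baseline: pd.DataFrame, current: pd.DataFrame,
                 z_threshold: float = 3.0, null_jump: float = 0.05) -> list[str]:
    """Flag columns whose mean shifted by more than z_threshold baseline
    standard deviations, or whose null rate jumped by more than null_jump."""
    flagged = []
    for col in baseline.index.intersection(current.index):
        b, c = baseline.loc[col], current.loc[col]
        mean_shift = abs(c["mean"] - b["mean"]) / (b["std"] or 1.0)
        if mean_shift > z_threshold or (c["null_rate"] - b["null_rate"]) > null_jump:
            flagged.append(col)
    return flagged

# Example: yesterday's batch is the baseline, today's batch is checked against it.
baseline = profile(pd.read_parquet("batch_2024_01_01.parquet"))  # hypothetical file
current = profile(pd.read_parquet("batch_2024_01_02.parquet"))   # hypothetical file
print("Drifting columns:", detect_drift(baseline, current))
```

In a production setting, a check like this would typically run as a task inside the orchestration layer (e.g., Airflow or Dagster) and feed its flags into the scoring engine the accountabilities describe.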
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
- Carry out software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Promote development standards, code reviews, mentoring and knowledge sharing
- Provide production support and troubleshooting
- Implement tools and processes, handling performance, scale, availability, accuracy and monitoring
- Liaise with BAs to ensure that requirements are correctly interpreted and implemented
- Participate in regular planning and status meetings, and provide input to the development process through involvement in Sprint reviews and retrospectives
- Provide input into system architecture and design
- Conduct peer code reviews
Requirements
To be successful in this role, you should meet the following requirements:
- Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+
- Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services
- Sound working knowledge of the Unix/Linux platform
- Hands-on experience building data pipelines using Hadoop components such as Hive, Spark and Spark SQL (a sketch of this pattern follows this listing)
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins) and requirement management in JIRA
- Understanding of big data modelling techniques using relational and non-relational approaches
- Experience debugging code issues and communicating the findings to the development team/architects
- Experience with time-series/analytics databases such as Elasticsearch
- Experience with scheduling tools such as Airflow and Control-M
- Understanding or experience of cloud design patterns
- Exposure to DevOps and Agile project methodologies such as Scrum and Kanban
- Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets
Location: Pune and Bangalore
You’ll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSDI
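The Hive/Spark SQL pipeline requirement above lends itself to a short illustration. The role is Scala-first, but to keep all sketches in this write-up in one language, the same Hive-backed pattern is expressed here in PySpark; the database, table, and column names are invented for illustration.

```python
# Minimal Hive-backed Spark SQL sketch: aggregate a semi-structured events
# table and persist the result back to Hive. Table/column names are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-pipeline-sketch")
         .enableHiveSupport()   # lets Spark read/write Hive metastore tables
         .getOrCreate())

daily = spark.sql("""
    SELECT  to_date(event_ts)                    AS event_date,
            get_json_object(payload, '$.type')   AS event_type,
            COUNT(*)                             AS events
    FROM    raw_db.events                        -- hypothetical Hive table
    GROUP BY to_date(event_ts), get_json_object(payload, '$.type')
""")

(daily.write
 .mode("overwrite")
 .saveAsTable("curated_db.daily_event_counts"))  # hypothetical target table
```

The same job in Scala would differ only in syntax; the Spark SQL statements, Hive metastore interaction, and write semantics are identical across both APIs.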
Posted 1 week ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.
What You'll Do
As part of the Data team within HR, the BCG Global Data Stewards are responsible for the design and implementation of an enterprise-wide Data Governance strategy that supports the efficient usage of data across the firm by ensuring key sets of fully governed, curated, and globally integrated data products and assets. Under the guidance of the Global Data Owner, each Global Data Steward supports key data initiatives (both functional and technical) in the domain, including defining and executing principles, harmonized definitions, quality assessments, and governance processes to ensure safe usage and evolution of enterprise-wide products and assets.
Specifically, the Compensation Data Steward will develop a data governance strategy for BCG’s Compensation data domain. This role is embedded within a Compensation Transformation project and will work to ensure that compensation data governance aligns with broader transformation objectives. You will collaborate with HR, Data, Finance, and Enterprise Services leaders, as well as their corresponding Business Intelligence, Operations, and Technology teams, to ensure a strong, compliant, and strategically valuable Compensation data ecosystem.
The Compensation domain includes salary structures, bonus programs, pay equity analytics, and long-term incentive plans across BCG. It supports BCG’s ability to make informed, strategic, and equitable compensation decisions for employees across the globe. This function is critical in ensuring compliance with global and local pay regulations, supporting fairness and transparency, and enabling data-driven insights into compensation trends and workforce planning.
The ideal candidate will have a goal-oriented mindset and enjoy working with cross-functional teams to deliver Data Governance capabilities and support exceptional Data Products. They will have 8+ years of experience in Data Governance, preferably in Compensation, HR, or Finance data, and be well-versed in compensation data structures, analytics, and regulatory compliance challenges. Additionally, they should have strong communication skills with proven experience in stakeholder engagement, including working with senior business and technical teams to showcase the business benefits of Data Governance.
Among Your Responsibilities, You Will
- Define an overall vision, roadmap, and priorities for our Compensation Data Governance strategy
- Work closely with the Global HR Data Governance Director, the HR Data Owner, and other Global Data Stewards (e.g., Worker and Candidate data) to develop a Compensation data strategy
- Understand end-to-end Compensation data flows across BCG, including salary structures, pay adjustments, incentives, and compliance requirements
- Gather business requirements from Compensation, HR, and Finance stakeholders to identify data gaps and opportunities
- Align the Compensation Data Governance strategy with broader BCG HR, Finance, and Data leadership priorities
- Create and enforce data quality policies and standards to maintain compliance and support decision-making (a minimal rule-based quality check is sketched after this listing)
- Ensure that data structures, processes, and governance mechanisms support Compensation Transformation objectives
- Work with HR Technology and IT teams to integrate governance controls into new Compensation systems and platforms
- Ensure sustainable delivery of customer value on the agreed Data Governance roadmap
- Ensure that areas of priority are identified (e.g., Metadata Management, Data Lineage, Data Quality) and a cohesive action plan is developed to ensure impactful value-add to end customers
- Build a clear narrative and articulation of Data Governance principles linked to business value-add, communicating these regularly to senior stakeholders across Data and HR
- Track and report on specific, measurable, aligned KPIs and key results, developing metrics to measure the governance maturity of Compensation data
- Prioritize identified data opportunities, clearly communicating with required stakeholders to ensure efficient delivery
- Proactively identify and escalate risks and mitigation plans, along with key decisions, to relevant stakeholders
- Engage with the Data Governance community to ensure alignment and best practices
- Collaborate with other Global Data Stewards and IT teams to ensure consistency across BCG’s Data Governance strategy
- Foster collaboration with Compensation, HR, Finance, and Data teams to ensure alignment on data opportunities and priorities
- Provide training and guidance on Compensation data best practices, compliance, and governance frameworks
What You'll Bring
- Bachelor’s or higher degree in Computer Science, Mathematics, Statistics, Finance, HR, or related fields
- 8+ years of experience in Data Governance, Compensation Data Management, HR Data Analytics, or Finance Data
- Strong knowledge of total Compensation data concepts: benefits, base, bonus, pay structures, incentives, and compliance
- Understanding of Data Governance frameworks (DAMA DMBoK, EDM Council’s DCAM) is beneficial
- Hands-on experience using Data Governance tools (e.g., Collibra, Talend, Informatica)
- Experience with Compensation systems and data platforms such as Workday, SAP, or Snowflake
- Strong stakeholder engagement and communication skills to collaborate with diverse, global teams
Who You'll Work With
- BCG Global HR and Data teams: the HR Data Product Portfolio, Data Governance CoE, Master Data Management, Enterprise Data Modelling, and Data Product development teams
- BCG HR, Finance and Data teams: business intelligence and analytics, operations, and technical teams, among other business functions
- BCG Leadership: Heads of HR, Recruiting, Finance, and Enterprise Services, among other business functions
- The broader Compensation Transformation project team
- The Data Governance network: Data Owners, Stewards, and Data Governance Directors
Additional info
YOU’RE GOOD AT
- Defining and implementing a global Compensation Data Governance framework and ensuring data quality, security, and compliance
- Understanding Compensation data structures, pay equity analytics, and salary frameworks
- Leading Compensation data-related governance initiatives, including Data Lineage, Master Data Management (MDM), Data Quality, and Data Architecture
- Partnering with HR, Finance, and IT teams to prioritize Compensation data initiatives
- Developing Compensation data analytics, metrics, and dashboards to track governance maturity
- Ensuring adherence to relevant global and local data policies, regulations, and standards
- Communicating with senior executive teams about the importance of Compensation data strategy
Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws.
BCG is an E-Verify Employer. Click here for more information on E-Verify.
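As a hedged illustration of the "data quality policies and standards" responsibility above, here is a minimal rule-based check over a compensation extract. The rules, field names, and the pandas approach are invented for illustration; in practice such checks would be defined and tracked in a governance tool like Collibra or Informatica.

```python
# Minimal rule-based compensation data-quality check. Field names, bands,
# and rules are hypothetical; real checks would be governed centrally.
import pandas as pd

RULES = {
    "employee_id is unique": lambda df: ~df["employee_id"].duplicated(),
    "base_salary is positive": lambda df: df["base_salary"] > 0,
    "bonus within 0-100% of base": lambda df: df["bonus"].between(0, df["base_salary"]),
    "currency looks like ISO 4217": lambda df: df["currency"].str.fullmatch(r"[A-Z]{3}"),
}

def run_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per rule with its pass rate and failing-record count."""
    results = []
    for name, rule in RULES.items():
        passed = rule(df).fillna(False)  # treat missing values as failures
        results.append({"rule": name,
                        "pass_rate": passed.mean(),
                        "failures": int((~passed).sum())})
    return pd.DataFrame(results)

comp = pd.read_csv("compensation_extract.csv")  # hypothetical extract
print(run_checks(comp))
```

The pass rates from a run like this are the raw material for the governance-maturity KPIs the responsibilities list mentions.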
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.
What You'll Do
Our Global HR Shared Services Centers (HRSSC), located across three global hubs - India, Costa Rica, and Portugal - deliver centralized and efficient support for HR processes worldwide. By working here, you’ll be part of a team that is transforming how we deliver world-class HR services to our employees globally. We support the full employee lifecycle with precision, enable efficiency gains through smart systems and collaboration, and deliver measurable outcomes that enhance every employee’s journey at BCG.
You will be a key member of our Global HR Shared Services Center (HRSSC), supporting regional and local HR teams and employees worldwide with administrative HR processes. You’ll collaborate with colleagues across multiple geographies and time zones, forming part of a close-knit global HR network that values teamwork, ownership, and continuous learning.
Key Responsibilities Include
- Preparing and processing employee paperwork for new hires, promotions, transfers, exits, and changes
- Maintaining personnel records in compliance with legal requirements and internal standards
- Supporting onboarding and background verification, including induction plans and welcome communications
- Managing employee documentation requests, including verification letters, references, and visa invitation letters
- Delivering reporting on employee data (e.g., distribution lists, anniversaries, milestones)
- Supporting internal audits with required documentation and timely responses
What You'll Bring
- A graduate degree
- Roughly 1–3+ years of relevant experience in HR operations, shared services, or a process-driven role
- Familiarity with Workday (preferred) or other HR ERP systems
- Proficiency in Microsoft Office (Excel, PowerPoint, Outlook, Word, Visio)
- Experience working in a professional services or multinational environment
- Fluent verbal and written English language skills; proficiency in Mandarin (both spoken and written) is also essential, as this role involves supporting China
Who You'll Work With
- Be part of a respected global brand that invests in its people
- Gain exposure to world-class HR systems, like Workday
- Work in a culture that prioritizes learning, diversity, and inclusion
- Join a growing team where your work directly drives global impact
Additional info
You’re Good At
- Thriving under pressure with exceptional attention to detail
- Staying flexible and reliable in a dynamic and changing environment
- Managing multiple tasks with structure and discipline
- Handling sensitive data with confidentiality and professionalism
- Communicating clearly and professionally, both in writing and speech
- Creating meaningful experiences for every customer through exceptional service
- Collaborating across cultures and time zones
Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws.
BCG is an E-Verify Employer. Click here for more information on E-Verify.
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.
You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.
Work you’ll do
As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more.
The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders
- Identify new tools and processes to improve the cloud platform and automate processes
(A minimal Dataflow-style pipeline sketch follows this listing.)
Qualifications
Technical Requirements
- BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume)
- Experience working with technical customers
- Experience writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript
Consulting Requirements
- 3-6 years of relevant consulting, industry or technology experience
- Strong problem-solving and troubleshooting skills
- Strong communicator
- Willingness to travel as project requirements demand
Preferred Qualifications
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
- Experience in technical consulting
- Experience architecting, developing software for, or delivering internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Experience working with big data, information retrieval, data mining or machine learning, as well as experience building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow)
- Working knowledge of ITIL and/or agile methodologies
Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300075
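As a hedged illustration of the Dataflow experience named in the technical requirements above, here is a minimal Apache Beam pipeline of the kind that runs on Dataflow. The bucket paths and CSV layout are invented, and the runner is the local DirectRunner so the sketch stays self-contained.

```python
# Minimal Apache Beam sketch: read CSV lines, count records per country,
# and write the results out. Paths and record layout are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions(runner="DirectRunner")  # swap for DataflowRunner on GCP

with beam.Pipeline(options=opts) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://hypothetical-bucket/events.csv")
     | "Parse" >> beam.Map(lambda line: line.split(","))      # e.g. id,country,ts
     | "FilterValid" >> beam.Filter(lambda rec: len(rec) == 3)
     | "KeyByCountry" >> beam.Map(lambda rec: (rec[1], 1))
     | "CountPerCountry" >> beam.combiners.Count.PerKey()
     | "Format" >> beam.MapTuple(lambda country, n: f"{country},{n}")
     | "Write" >> beam.io.WriteToText("gs://hypothetical-bucket/out/counts"))
```

Pointing the same pipeline at Dataflow is mostly a matter of setting runner="DataflowRunner" plus a GCP project, region, and staging bucket in the pipeline options; the transform graph itself does not change.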
Posted 1 week ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.
You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.
Work you’ll do
As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more.
The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders
- Identify new tools and processes to improve the cloud platform and automate processes
Qualifications
Technical Requirements
- BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume)
- Experience working with technical customers
- Experience writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript
Consulting Requirements
- 3-6 years of relevant consulting, industry or technology experience
- Strong problem-solving and troubleshooting skills
- Strong communicator
- Willingness to travel as project requirements demand
Preferred Qualifications
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
- Experience in technical consulting
- Experience architecting, developing software for, or delivering internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Experience working with big data, information retrieval, data mining or machine learning, as well as experience building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow)
- Working knowledge of ITIL and/or agile methodologies
Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300075
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.
You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.
Work you’ll do
As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more.
The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders
- Identify new tools and processes to improve the cloud platform and automate processes
Qualifications
Technical Requirements
- BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume)
- Experience working with technical customers
- Experience writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript
Consulting Requirements
- 3-6 years of relevant consulting, industry or technology experience
- Strong problem-solving and troubleshooting skills
- Strong communicator
- Willingness to travel as project requirements demand
Preferred Qualifications
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
- Experience in technical consulting
- Experience architecting, developing software for, or delivering internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Experience working with big data, information retrieval, data mining or machine learning, as well as experience building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow)
- Working knowledge of ITIL and/or agile methodologies
Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300075
Posted 1 week ago