
8236 Hadoop Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 10.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

At EY, you will have the opportunity to shape a career as unique as you are, supported by a global network, an inclusive culture, and cutting-edge technology to help you reach your full potential. Your individual perspective and voice are valued as contributions to the continuous improvement of EY. By joining us, you can create an outstanding experience for yourself while contributing to a more efficient and inclusive working world for all.

As a Data Engineering Lead, you will work closely with the Data Architect to design and implement scalable data lake architecture and data pipelines. Your responsibilities will include designing and implementing scalable data lake architectures using Azure Data Lake services, developing and maintaining data pipelines for ingestion from various sources, optimizing data storage and retrieval for efficiency and performance, ensuring data security and compliance with industry standards, collaborating with data scientists and analysts to enhance data accessibility, monitoring and troubleshooting pipeline issues to ensure reliability, and documenting data lake designs, processes, and best practices. You should have experience with SQL and NoSQL databases, as well as familiarity with big data file formats such as Parquet and Avro.

**Must Have Skills:**
- Azure Data Lake
- Azure Synapse Analytics
- Azure Data Factory
- Azure Databricks
- Python (PySpark, NumPy, etc.)
- SQL
- ETL
- Data warehousing
- Azure DevOps
- Experience developing streaming pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark streaming
- Experience integrating with business intelligence tools such as Power BI

**Good To Have Skills:**
- Big Data technologies (e.g., Hadoop, Spark)
- Data security

**General Skills:**
- Experience with Agile and DevOps methodologies and the software development lifecycle
- Proactive and accountable for deliverables
- Ability to identify and escalate dependencies and risks
- Proficient in working with DevOps tools with limited supervision
- Timely completion of assigned tasks and regular status reporting
- Capability to train new team members
- Desired knowledge of cloud solutions such as Azure or AWS, with DevOps/Cloud certifications
- Ability to work effectively with multicultural global teams, including virtually
- Strong relationship-building skills with project stakeholders

Join EY in its mission to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Leveraging data and technology, diverse EY teams across 150+ countries provide assurance and support clients in growth, transformation, and operations across various sectors. Through its services in assurance, consulting, law, strategy, tax, and transactions, EY teams strive to address complex global challenges by asking insightful questions to discover innovative solutions.
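For readers who want to picture the day-to-day work this listing describes, here is a minimal PySpark sketch of a batch ingestion step into an Azure data lake: read raw Parquet from ADLS Gen2, normalize it lightly, and write a curated, partitioned layer. The storage paths, container names, and column handling are illustrative assumptions, not details from the posting, and authentication configuration is omitted.

```python
# Minimal PySpark ingestion sketch (illustrative; paths and columns are assumptions,
# and ADLS authentication config is omitted for brevity).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-ingest-example").getOrCreate()

# Read raw Parquet from an ADLS Gen2 container (abfss://<container>@<account>.dfs.core.windows.net/...).
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sales/2024/")

# Light transformation: normalize column names, drop exact duplicates, stamp the load date.
curated = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .dropDuplicates()
       .withColumn("load_date", F.current_date())
)

# Write a partitioned curated layer for downstream Synapse/Power BI consumption.
(curated.write
        .mode("overwrite")
        .partitionBy("load_date")  # in practice, prefer a business date column
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales/"))
```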

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a ClickHouse Database Specialist, you will help build production-grade systems based on ClickHouse: advising on schema design, planning clusters, and working on related infrastructure projects. You will work across diverse environments, from single-node setups to clusters with hundreds of nodes. You will also contribute to ClickHouse itself by fixing bugs, improving documentation, creating test cases, and studying new usage patterns, functions, and integrations with other products.

Your role will also entail installation, configuration, backup, recovery, and maintenance of multi-node ClickHouse clusters. Monitoring and optimizing database performance to ensure high availability and responsiveness will be a key responsibility. Troubleshooting database issues, identifying and resolving performance bottlenecks, designing and implementing backup and recovery strategies, and developing database security policies and procedures will be part of your daily tasks. Collaborating with development teams to optimize schema design and queries, providing technical guidance and support to development and operations teams, and handling support calls from customers using ClickHouse will be crucial components of this role.

Experience with big data stack components such as Hadoop, Spark, Kafka, and NiFi, as well as with data science and data analysis, will be beneficial. Knowledge of SRE/DevOps stacks and of monitoring and system management tools such as Prometheus, Ansible, and ELK, plus version control using Git, are also desired skills for this position.

In summary, as a ClickHouse Database Specialist, you will play a vital role in ensuring the efficient operation and optimization of ClickHouse databases, contributing to the overall success of production-grade systems and infrastructure projects.
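As a rough sketch of the schema-design advice this role involves, the snippet below uses the clickhouse-driver package to create a MergeTree table and perform a batched insert. The host, table, and columns are assumptions for illustration, not details from the posting.

```python
# Hypothetical schema-design sketch using clickhouse-driver (pip install clickhouse-driver);
# the host, table, and columns are assumptions.
from datetime import datetime
from clickhouse_driver import Client

client = Client(host="localhost")

# MergeTree is ClickHouse's workhorse engine: the ORDER BY key drives on-disk
# sorting and sparse indexing, and PARTITION BY bounds the scope of merges.
client.execute("""
    CREATE TABLE IF NOT EXISTS events (
        event_time DateTime,
        user_id    UInt64,
        event_type LowCardinality(String),
        payload    String
    )
    ENGINE = MergeTree
    PARTITION BY toYYYYMM(event_time)
    ORDER BY (event_type, user_id, event_time)
""")

# Batched inserts are far cheaper than row-at-a-time writes in ClickHouse.
client.execute(
    "INSERT INTO events (event_time, user_id, event_type, payload) VALUES",
    [(datetime.utcnow(), 42, "click", "{}")],
)
```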

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a Data Engineer, you will collaborate with data scientists, software engineers, and business stakeholders to understand data requirements and develop efficient data models. Your key tasks will include designing and implementing robust data pipelines, ETL processes, and data integration solutions that are scalable and reliable. You will play a crucial role in extracting, transforming, and loading data from multiple sources while maintaining data quality, integrity, and consistency. It will be essential to optimize data processing and storage systems to handle large volumes of structured and unstructured data effectively.

Additionally, you will conduct data cleaning, normalization, and enrichment to prepare datasets for analysis and modeling. Monitoring data flows and processes, and identifying and resolving data-related issues and bottlenecks, will be part of your daily responsibilities. Contributing to better data engineering practices and standards within the organization, and staying abreast of industry trends and emerging technologies, will be vital to your success in this role.

In terms of qualifications, you should have a strong passion for data engineering, artificial intelligence, and problem-solving. A solid understanding of data engineering concepts, data modeling, and data integration techniques is essential. Proficiency in Python, SQL, and web scraping is required, and familiarity with NoSQL and relational databases and technologies such as MongoDB, Redis, and Apache Spark would be advantageous. Knowledge of distributed computing frameworks and big data technologies, such as Hadoop and Spark, will be considered a plus. Excellent analytical and problem-solving skills, attention to detail, and strong communication and collaboration abilities are also key attributes for this role. Being self-motivated, a quick learner, and adaptable to changing priorities and technologies will help you excel in this position.
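To make the cleaning and normalization duties concrete, here is a small pandas sketch of an extract-transform-load pass; the file names and columns are hypothetical, not details from the posting.

```python
# Minimal pandas ETL sketch (file names and columns are assumptions).
import pandas as pd

# Extract: load a raw CSV dump.
raw = pd.read_csv("orders_raw.csv")

# Transform: normalize column names, coerce types, handle missing values.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce").fillna(0.0)
clean = raw.dropna(subset=["order_date"]).drop_duplicates(subset=["order_id"])

# Load: write the curated dataset for downstream analysis (requires pyarrow).
clean.to_parquet("orders_clean.parquet", index=False)
```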

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

As a Solution Architect, you will drive the design and development of highly scalable, stable, and secure modern applications. Your main responsibilities will include designing and developing applications with a modular, loosely coupled design, deployable on a cloud-native architecture. You will implement SRE practices to seamlessly manage applications in production using observability, monitoring, and automation. Additionally, you will collaborate with stakeholders to gather business and functional requirements and translate them into technical specifications based on defined architectures.

You will lead the preparation of detailed design specifications to guide the development, modification, and enhancement of applications. It will be your responsibility to define integration patterns, map interdependencies of business requirements to solution building blocks using architectural best practices, and establish standards and robust processes. Ensuring design consistency across the organization, reducing repeated work, and promoting the adoption of defined solutions will be crucial aspects of your role. You will also recommend designs that consider current applications and architecture, operating models, and the end target-state architecture. Developing and documenting solution specifications that serve as a reference for implementation will be part of your responsibilities, as will collaborating with different engineering teams to develop and agree on system integration standards.

To be successful as a Solution Architect, you should have 8-12 years of hands-on experience using modeling tools for process and end-to-end solution design. In-depth industry and academic knowledge, along with proven experience in process analysis and design, are essential. Proficiency in at least one major programming language such as Java, C#, or Python is required. You should also have good knowledge of web and mobile development standards and database technologies, including Spring Boot, Node.js, MariaDB/MySQL, PostgreSQL, and the Hadoop ecosystem. A solid understanding of DevOps, SRE, and Agile methodology, as well as tools like Maven, Jcube, Nexus, and Cucumber, is necessary, and proficiency with Git, Bitbucket, Jenkins, Artifactory, and Nexus is expected.

A strong understanding of distributed systems, knowledge and experience in integrating data and applications with business processes, policies, and regulations, and familiarity with risk and governance concepts are important qualifications. You should possess highly organized and structured thinking, along with the ability to understand and synthesize unstructured information. Analytical thinking, combined with strong communication skills, is essential for engaging with geographically diverse teams. Knowledge of business process modeling, information systems analysis and design, enterprise technology, and data modeling will further enhance your suitability for this role.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly skilled and experienced Senior Data Scientist with a strong background in Artificial Intelligence (AI) and Machine Learning (ML). You will join our team as an innovative, analytical, and collaborative team player with a proven track record in end-to-end AI/ML project delivery, including expertise in data processing, modeling, and model deployment. A minimum of 5-7 years of experience in data science, with a focus on AI/ML applications, is expected.

Your technical skills should include proficiency in a wide range of ML algorithms, such as regression, classification, clustering, decision trees, neural networks, and deep learning architectures (e.g., CNNs, RNNs, GANs). Strong programming skills in Python, R, or Scala are required, along with experience in ML libraries like TensorFlow, PyTorch, and scikit-learn. You should have experience in data wrangling, cleaning, and feature engineering, with familiarity in SQL and data processing frameworks like Apache Spark. Model deployment using tools like Docker, Kubernetes, and cloud services (AWS, GCP, or Azure) should be part of your skill set. A strong foundation in statistics, probability, and the mathematical concepts used in AI/ML is essential, along with proficiency in data visualization tools such as Tableau, Power BI, or matplotlib.

Preferred qualifications include familiarity with big data tools like Hadoop and Hive and with distributed computing. Hands-on experience with NLP techniques such as text mining, sentiment analysis, and transformers is a plus. Expertise in analyzing and forecasting time-series data, as well as familiarity with CI/CD pipelines for ML, model versioning, and performance monitoring, are also preferred. Leadership experience, such as leading cross-functional project teams or managing data science projects in a production setting, is valued.

Your personal attributes should include the problem-solving skills to break down complex problems and design innovative, data-driven solutions. Strong written and verbal communication skills are necessary to convey technical insights clearly to diverse audiences. A keen interest in staying updated with the latest advancements in AI and ML, along with the ability to quickly learn and implement new technologies, is expected.
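As a compact illustration of the end-to-end modeling loop the posting expects (data preparation, training, evaluation), here is a scikit-learn sketch on synthetic data; the dataset, model, and hyperparameters are assumptions for demonstration, not part of the role description.

```python
# End-to-end modeling sketch with scikit-learn on synthetic data (all choices illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a prepared feature matrix.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train and evaluate a baseline classifier.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```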

Posted 1 day ago

Apply

5.0 years

0 Lacs

Haryana, India

On-site

About TaskUs: TaskUs is a provider of outsourced digital services and next-generation customer experience to fast-growing technology companies, helping its clients represent, protect, and grow their brands. Leveraging a cloud-based infrastructure, TaskUs serves clients in the fastest-growing sectors, including social media, e-commerce, gaming, streaming media, food delivery, ride-sharing, HiTech, FinTech, and HealthTech. The People First culture at TaskUs has enabled the company to expand its workforce to approximately 45,000 employees globally. Presently, we have a presence in twenty-three locations across twelve countries, including the Philippines, India, and the United States. It started with one ridiculously good idea: to create a different breed of Business Process Outsourcing (BPO)! We at TaskUs understand that achieving growth for our partners requires a culture of constant motion, exploring new technologies, being ready to handle any challenge at a moment's notice, and mastering consistency in an ever-changing world.

What We Offer: At TaskUs, we prioritize our employees' well-being by offering competitive industry salaries and comprehensive benefits packages. Our commitment to a People First culture is reflected in the various departments we have established, including Total Rewards, Wellness, HR, and Diversity. We take pride in our inclusive environment and positive impact on the community. Moreover, we actively encourage internal mobility and professional growth at all stages of an employee's career within TaskUs. Join our team today and experience firsthand our dedication to supporting People First.

Job Description Summary: We are seeking a Data Scientist with deep expertise in modern AI/ML technologies to join our innovative team. This role combines cutting-edge research in machine learning, deep learning, and generative AI with practical full-stack cloud development skills. You will be responsible for architecting and implementing end-to-end AI solutions, from data engineering pipelines to production-ready applications leveraging the latest in agentic AI and large language models.

Key Responsibilities:

AI/ML Development & Research
- Design, develop, and deploy advanced machine learning and deep learning models for complex business problems
- Implement and optimize Large Language Models (LLMs) and Generative AI solutions
- Build agentic AI systems with autonomous decision-making capabilities
- Conduct research on emerging AI technologies and their practical applications
- Perform model evaluation, validation, and continuous improvement

Cloud Infrastructure & Full-Stack Development
- Architect and implement scalable cloud-native ML/AI solutions on AWS, Azure, or GCP
- Develop full-stack applications integrating AI models with modern web technologies
- Build and maintain ML pipelines using cloud services (SageMaker, ML Engine, etc.)
- Implement CI/CD pipelines for ML model deployment and monitoring
- Design and optimize cloud infrastructure for high-performance computing workloads

Data Engineering & Database Management
- Design and implement data pipelines for large-scale data processing
- Work with both SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Optimize database performance for ML workloads and real-time applications
- Implement data governance and quality assurance frameworks
- Handle streaming data processing and real-time analytics

Leadership & Collaboration
- Mentor junior data scientists and guide technical decision-making
- Collaborate with cross-functional teams, including product, engineering, and business stakeholders
- Present findings and recommendations to technical and non-technical audiences
- Lead proof-of-concept projects and innovation initiatives

Required Qualifications:

Education & Experience
- Master's or PhD in Computer Science, Data Science, Statistics, Mathematics, or a related field
- 5+ years of hands-on experience in data science and machine learning
- 3+ years of experience with deep learning frameworks and neural networks
- 2+ years of experience with cloud platforms and full-stack development

Technical Skills - Core AI/ML
- Machine Learning: scikit-learn, XGBoost, LightGBM, advanced ML algorithms
- Deep Learning: TensorFlow, PyTorch, Keras, CNN, RNN, LSTM, Transformers
- Large Language Models: GPT, BERT, T5, fine-tuning, prompt engineering
- Generative AI: Stable Diffusion, DALL-E, text-to-image, text generation
- Agentic AI: multi-agent systems, reinforcement learning, autonomous agents

Technical Skills - Development & Infrastructure
- Programming: Python (expert), R, Java/Scala, JavaScript/TypeScript
- Cloud Platforms: AWS (SageMaker, EC2, S3, Lambda), Azure ML, or Google Cloud AI
- Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, Cassandra, DynamoDB)
- Full-Stack Development: React/Vue.js, Node.js, FastAPI, Flask, Docker, Kubernetes
- MLOps: MLflow, Kubeflow, model versioning, A/B testing frameworks
- Big Data: Spark, Hadoop, Kafka, streaming data processing

Preferred Qualifications:
- Experience with vector databases and embeddings (Pinecone, Weaviate, Chroma)
- Knowledge of LangChain, LlamaIndex, or similar LLM frameworks
- Experience with model compression and edge deployment
- Familiarity with distributed computing and parallel processing
- Experience with computer vision and NLP applications
- Knowledge of federated learning and privacy-preserving ML
- Experience with quantum machine learning
- Expertise in MLOps and production ML system design

Key Competencies:

Technical Excellence
- Strong mathematical foundation in statistics, linear algebra, and optimization
- Ability to implement algorithms from research papers
- Experience with model interpretability and explainable AI
- Knowledge of ethical AI and bias detection/mitigation

Problem-Solving & Innovation
- Strong analytical and critical thinking skills
- Ability to translate business requirements into technical solutions
- Creative approach to solving complex, ambiguous problems
- Experience with rapid prototyping and experimentation

Communication & Leadership
- Excellent written and verbal communication skills
- Ability to explain complex technical concepts to diverse audiences
- Strong project management and organizational skills
- Experience mentoring and leading technical teams

How We Partner To Protect You: TaskUs will neither solicit money from you during your application process nor require any form of payment in order to proceed with your application. Kindly ensure that you are always in communication with only authorized recruiters of TaskUs.

DEI: At TaskUs, we believe that innovation and higher performance are brought by people from all walks of life. We welcome applicants of different backgrounds, demographics, and circumstances. Inclusive and equitable practices are our responsibility as a business. TaskUs is committed to providing equal access to opportunities. If you need reasonable accommodations in any part of the hiring process, please let us know.

We invite you to explore all TaskUs career opportunities and apply through the provided URL: https://www.taskus.com/careers/. TaskUs is proud to be an equal opportunity workplace and an affirmative action employer. We celebrate and support diversity; we are committed to creating an inclusive environment for all employees. TaskUs's People First culture thrives on it for the benefit of our employees, our clients, our services, and our community.

Req Id: R_2507_10290_0. Posted: 31 July 2025.
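For a taste of the LLM work described in this listing, the snippet below uses the Hugging Face transformers pipeline to load a small open model and generate text. The model choice and prompt are illustrative assumptions, not tools named by the employer, and the first run downloads model weights.

```python
# Minimal generative-AI sketch with Hugging Face transformers
# (pip install transformers torch); the model and prompt are assumptions.
from transformers import pipeline

# Load a small open text-generation model.
generator = pipeline("text-generation", model="distilgpt2")

prompt = "Summarize why data pipelines matter for ML systems:"
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```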

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

About Marriott: Marriott Tech Accelerator is part of Marriott International, a global leader in hospitality. Marriott International, Inc. is a leading American multinational company that operates a vast array of lodging brands, including hotels and residential properties. It consists of over 30 well-known brands and nearly 8,900 properties situated in 141 countries and territories.

Role Title: Security Data Scientist

Position Summary: Marriott International's Global Information Security is seeking an experienced Security Data Scientist who can combine expertise in cybersecurity with data science skills to analyze and protect Marriott's digital assets.

Job Responsibilities:
- Perform data cleaning, analysis, and modeling tasks.
- Work under the guidance of senior team members to: analyze large datasets related to cybersecurity threats and incidents; implement existing machine learning models and algorithms to detect anomalies and potential security breaches; and support SDL tools (e.g., big data, ML/AI technologies).
- Create data visualizations and reports to communicate insights to stakeholders.
- Collaborate with cybersecurity teams to implement data-driven security solutions.
- Stay up to date with the latest cyber threats and data science techniques.
- Help maintain and document SDL MLOps processes and procedures.

Skill and Experience:
- 2-4 years of data science, data analytics, data management, and/or information security experience, including 2+ years of experience in data science/data analytics in an enterprise environment and 1+ years of experience in information protection/information security.
- Strong background in statistics, mathematics, and software engineering (e.g., proficiency in Python, R).
- Experience with machine learning algorithms and frameworks, as well as AI techniques.
- Knowledge of cybersecurity principles, tools, and best practices.
- Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies.
- Understanding of data visualization tools like Power BI.

Preferred:
- Programming languages: Python, R, SQL.
- Machine learning frameworks: TensorFlow, PyTorch, scikit-learn.
- Big data technologies: Hadoop, Spark, and Kafka.
- Cloud platforms: AWS, Azure, GCP.
- Data visualization tools: Tableau, Power BI.
- Relevant certifications, such as data science certifications, CISSP, CEH.
- Verbal and written communication skills.

Education and Certifications: Bachelor's degree in computer/data science, information management, cybersecurity, or a related field, or equivalent experience/certification.

Work location: Hyderabad, India. Work mode: Hybrid.
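To illustrate the anomaly-detection responsibility in code, here is a minimal scikit-learn sketch using an Isolation Forest on synthetic security telemetry; the features (failed logins, data transferred) are hypothetical, not taken from the posting.

```python
# Anomaly-detection sketch on synthetic security telemetry (features are hypothetical).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Rows: [failed_logins_per_hour, mb_transferred]; mostly normal traffic plus two outliers.
normal = rng.normal(loc=[2, 50], scale=[1, 10], size=(500, 2))
anomalies = np.array([[40.0, 900.0], [35.0, 750.0]])
X = np.vstack([normal, anomalies])

# Isolation Forest isolates outliers quickly because they need few random splits.
detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = detector.predict(X)  # -1 marks suspected anomalies
print(X[flags == -1])
```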

Posted 1 day ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Architect, you will lead the analysis, design, and delivery of analytics solutions and applications, including statistical data models, reports, and dashboards, in cloud environments such as AWS, Azure, and GCP, and on the corresponding cloud-based EDW database platforms such as Snowflake, Redshift, and BigQuery. You must have a minimum of 8 years of experience, with at least 3 years in a data architect role for Data Warehouse and Analytics solutions.

Your role will draw on 3+ years of experience with cloud platforms (AWS, Azure, GCP) and a strong understanding of ingestion and consumption processes in Data Lakes. You should also have 3+ years of experience with cloud-based EDW platforms such as Snowflake, Redshift, BigQuery, or Synapse, and be adept at building and launching new data models that provide intuitive analytics for analysts and customers.

In this position, you will work with and analyze large datasets within the relevant domains of enterprise data, and demonstrate strong experience in Data Warehouse ETL design and development, methodologies, tools, processes, and best practices. Proficiency in writing complex SQL, PL/SQL, and UNIX scripts, along with an understanding of performance tuning and troubleshooting, is also crucial for this role.

You should possess good communication and presentation skills, with a proven track record of using insights to influence executives and colleagues. Awareness of or expertise in data security, data access controls, DevOps tools, and development frameworks like SCRUM/Agile will be beneficial. Your responsibilities will also include recommending solutions to improve cloud and existing data warehouse solutions, and showcasing the capabilities of advanced analytics to business and technology teams to demonstrate the potential of the data platform. Your leadership will be essential in driving cross-functional development of new solutions from design through delivery. (ref: hirist.tech)
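As a hedged sketch of the complex-SQL side of this role, the snippet below runs a windowed revenue query against a cloud EDW via snowflake-connector-python; the credentials, warehouse, schema, and table are placeholders, and the same query pattern applies to Redshift or BigQuery with their respective clients.

```python
# Sketch of an analytical EDW query via snowflake-connector-python
# (pip install snowflake-connector-python); all connection details and
# the fact_orders table are placeholder assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="SALES",
)

# A typical consumption-layer query: daily revenue with a 7-day moving average.
query = """
    SELECT order_date,
           SUM(amount) AS revenue,
           AVG(SUM(amount)) OVER (
               ORDER BY order_date ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
           ) AS revenue_7d_avg
    FROM fact_orders
    GROUP BY order_date
    ORDER BY order_date
"""

cur = conn.cursor()
try:
    cur.execute(query)
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```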

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Software Engineer for Financial Crime Solutions (Crypto), you will play a crucial role in the development and leadership of Crypto Secure, a powerful risk assessment tool that gives banks enhanced visibility into crypto spend, transaction volumes, and AML risk exposure through a user-friendly dashboard.

Your responsibilities will include leading technical execution and delivery: designing, developing, and maintaining scalable applications using Java, Spring Boot, React, PostgreSQL, Apache NiFi, Hadoop, Snowflake, and other modern technologies. You will break down high-level requirements into technical solutions, drive technical decision-making, and ensure alignment with architecture and security best practices. Additionally, you will mentor and guide the team, conduct code reviews, and foster a culture of continuous learning and innovation.

Collaboration across teams will be essential: you will partner with Product Managers and System Architects to align technical and business priorities, work closely with Quality Engineers and Business Analysts, and support project managers in managing technical dependencies. Furthermore, you will champion engineering excellence by advocating for clean, maintainable, and testable code, staying current with industry trends, promoting DevOps and CI/CD best practices, and driving the adoption of accessibility, internationalization, and performance best practices.

To excel in this role, you should have 8+ years of experience developing enterprise-grade applications, strong full-stack experience with Java, Spring Boot, React, and relational databases, and knowledge of big data technologies and cloud platforms. Your leadership and collaboration skills, along with an ownership mentality, a detail-oriented approach, and a passion for innovation, will be key to your success.

This role offers the opportunity to lead a talented team, influence the technical direction of a critical security product, work on challenging problems at the intersection of crypto risk, finance, and cutting-edge technology, and have a direct impact on product success. You will also collaborate with experienced Software Architects, deepen your understanding of software architecture and design principles, and develop the skills needed for future leadership roles. If you are looking to be part of a dynamic team where your ideas matter and your work makes a difference, this role could be the perfect fit for you.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This role has been designed as "Hybrid" with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description: Aruba is an HPE company and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the "Intelligent Edge" and creating new customer experiences across intelligent spaces and digital workspaces. Join us to redefine what's next for you.

How You Will Make Your Mark: The ideal candidate will have experience working with AI technologies, including LLMs/GenAI, and with application development, to build and deploy an AI chatbot supporting business management. Experience with MS Power Platform, Java, and Databricks is preferred.

What You'll Do: As a Sr. AI Developer, your primary responsibility will be full-stack development of an AI chatbot application for business management, integrating business-relevant data with LLMs and helping the team deliver incremental features for on-demand AI-assisted analytics services on a hybrid tech stack.
- Translate business requirements into scalable and performant technical solutions.
- Design, code, test, and assure the quality of complex AI-powered product features.
- Partner with a highly motivated and talented set of colleagues.
- Be a motivated self-starter who can operate with minimal handholding.
- Collaborate across teams and time zones, demonstrating flexibility and accountability.

Education and Experience Required:
- 8-10+ years of Data Engineering and AI Development experience, with significant exposure to building AI chatbots on a hybrid tech stack across SQL Server, Hadoop, Azure Data Factory, and Databricks.
- Advanced university degree (e.g., Master's) or demonstrable equivalent.

What You Need To Bring:
- Demonstrated ability to build or integrate AI-driven features into enterprise applications.
- Strong knowledge of Computer Science fundamentals.
- Experience with SQL databases and building SSIS packages; knowledge of NoSQL and event streaming (e.g., Kafka) is a bonus.
- Experience working with LLMs and generative AI frameworks (e.g., OpenAI, Hugging Face).
- Proficiency in MS Power Platform and Java; Scala and Python experience preferred.
- Experience with SAP software (e.g., SAP S/4HANA, SAP BW) is an asset.
- Proven track record of writing production-grade code for enterprise-scale systems.
- Knowledge of agentic AI and related frameworks.
- Strong collaboration and communication skills.
- Experience using tools like JIRA for tracking tasks and bugs, with Agile CI/CD workflows.
- Strong domain experience across Sales, Finance, or Operations, with a deep understanding of key KPIs and metrics.
- Collaborates with senior managers/directors of the business on the AI chatbot, BI, Data Science, and Analytics roadmap; owns business requirements, prioritization, and execution to deliver actionable insights that enable decision-making, support strategic initiatives, and accelerate profitable growth.
- Functions as the subject matter expert for data, analytics, and reporting systems within the organization to ensure accurate and proper interpretation of core business KPIs/metrics.
- Performs deep-dive investigations, including applying advanced techniques, to solve some of the most critical and complex business problems in support of business transformation, enabling Product, Support, and Software-as-a-Service offerings.

What We Can Offer You:

Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial, and emotional wellbeing.

Personal & Professional Development: We also invest in your career, because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.

Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture, and tech at HPE. #india #aruba

Job: Business Planning. Job Level: Expert.

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran / Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This is a data engineer position: a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is defining optimal solutions for data collection, processing, and warehousing. The role requires Spark and Java development expertise in big data processing, along with Python and Apache Spark experience, particularly within the banking and finance domain. You will design, code, and test data systems and work on implementing them into the internal infrastructure.

Responsibilities:
- Ensure high-quality software development, with complete documentation and traceability
- Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data
- Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance
- Ensure efficient data storage and retrieval using big data technologies
- Implement best practices for Spark performance tuning, including partitioning, caching, and memory management (see the PySpark sketch after this listing)
- Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins)
- Work on batch processing frameworks for market risk analytics
- Promote unit/functional testing and code inspection processes
- Work with business stakeholders and Business Analysts to understand requirements
- Work with data scientists to understand and interpret complex datasets

Qualifications:
- 5-8 years of experience working in data ecosystems
- 4-5 years of hands-on experience with Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks
- 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
- Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
- Data integration, migration, and large-scale ETL experience (common ETL platforms such as PySpark, DataStage, Ab Initio, etc.), including ETL design and build, handling, reconciliation, and normalization
- Data modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning)
- Experience working with large and multiple datasets and data warehouses
- Experience building and optimizing "big data" pipelines, architectures, and datasets
- Strong analytic skills and experience working with unstructured datasets
- Ability to effectively use complex analytical, interpretive, and problem-solving techniques
- Experience with Confluent Kafka, Red Hat jBPM, and CI/CD build pipelines and toolchains: Git, Bitbucket, Jira
- Experience with external cloud platforms such as OpenShift, AWS, and GCP
- Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos)
- Experience integrating search solutions with middleware and distributed messaging (Kafka)
- Highly effective interpersonal and communication skills with technical and non-technical stakeholders
- Experience with the software development life cycle and good problem-solving skills
- Excellent problem-solving skills and a strong mathematical and analytical mindset
- Ability to work in a fast-paced financial environment

Education: Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Data Architecture ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills: Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
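The PySpark sketch below grounds the Spark tuning bullet above: repartitioning by an aggregation key and caching only reused results. The posting emphasizes Spark with Java, but the same concepts carry over; the data here is synthetic and the sizes are illustrative assumptions.

```python
# PySpark illustration of the partitioning and caching concepts the role mentions;
# the data is synthetic and the partition count is an illustrative assumption.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Synthetic "trades" dataset keyed by desk.
trades = spark.range(0, 10_000_000).withColumn("desk", (F.col("id") % 50).cast("string"))

# Repartition by the aggregation key so each shuffle task gets a balanced slice.
trades = trades.repartition(200, "desk")

# Cache only what is reused: this aggregate feeds several downstream reports.
desk_totals = trades.groupBy("desk").agg(F.count("*").alias("n")).cache()
desk_totals.count()          # materialize the cache
print(desk_totals.orderBy(F.desc("n")).limit(5).collect())

desk_totals.unpersist()      # release executor memory when done
spark.stop()
```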

Posted 1 day ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This role has been designed as "Hybrid" with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description: Aruba is an HPE company and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the "Intelligent Edge" and creating new customer experiences across intelligent spaces and digital workspaces. Join us to redefine what's next for you.

How You Will Make Your Mark: The ideal candidate will have experience deploying and managing enterprise-scale Data Governance practices, along with Data Engineering experience developing the database layer to support and enable AI initiatives, as well as a streamlined user experience around Data Discovery and Security & Access Control for meaningful, business-relevant analytics. The candidate will be comfortable with the full-stack analytics ecosystem, from the database layer to BI dashboards and AI/Data Science models and solutions, to effectively define and implement a scalable Data Governance practice.

What You'll Do:
- Drive the design and development of Data Dictionary, Lineage, Data Quality, and Security & Access Control for business-relevant data subjects and reports across business domains.
- Engage with the business user community to enable ease of Data Discovery and build trust in the data through Data Quality & Reliability monitoring, with key metrics and SLAs defined.
- Support the development and sustaining of data subjects in the database layer to enable BI dashboards and AI solutions.
- Drive engagement and alignment with the HPE IT/CDO team on governance initiatives, including partnering with functional teams across the business.
- Test, validate, and assure the quality of complex AI-powered product features.
- Partner with a highly motivated and talented set of colleagues.
- Be a motivated self-starter who can operate with minimal handholding.
- Collaborate across teams and time zones, demonstrating flexibility and accountability.

Education and Experience Required:
- 7+ years of Data Governance and Data Engineering experience, with significant exposure to enabling data availability, discovery, quality, and reliability, with appropriate security and access controls, in an enterprise-scale ecosystem.
- First-level university degree.

What You Need To Bring:
- Experience working with data governance and metadata management tools (Collibra, Databricks Unity Catalog, Atlan, etc.).
- Subject matter expertise in consent management concepts and tools.
- Demonstrated knowledge of research methodology and the ability to manage complex data requests.
- Excellent analytical thinking, technical analysis, and data manipulation skills.
- Proven track record of developing SQL/SSIS packages with ETL flows.
- Experience with AI application deployment governance is a plus.
- Technologies such as MS SQL Server, Databricks, Hadoop, and SAP S/4HANA.
- Experience with SQL databases and building SSIS packages; knowledge of NoSQL and event streaming (e.g., Kafka) is a bonus.
- Exceptional interpersonal and written communication skills.
- Experience and comfort solving problems in an ambiguous environment with constant change.
- Ability to think logically, communicate clearly, and be well organized.
- Strong knowledge of Computer Science fundamentals.
- Experience working with LLMs and generative AI frameworks (e.g., OpenAI, Hugging Face).
- Proficiency in MS Power Platform and Java; Scala and Python experience preferred.
- Strong collaboration and communication skills.
- Performs deep-dive investigations, including applying advanced techniques, to solve some of the most critical and complex business problems in support of business transformation, enabling Product, Support, and Software-as-a-Service offerings.
- Strong business acumen and technical knowledge within the area of responsibility.
- Strong project management skills.

What We Can Offer You:

Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial, and emotional wellbeing.

Personal & Professional Development: We also invest in your career, because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.

Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture, and tech at HPE. #india #aruba

Job: Business Planning. Job Level: Specialist.

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran / Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Provide expertise in your area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency

Qualifications:
- 6-10 years of relevant experience in an applications development or systems analysis role
- Extensive experience in system analysis and in programming of software applications
- Experience in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Consistently demonstrates clear and concise written and verbal communication

Education: Bachelor's degree/University degree or equivalent experience; Master's degree preferred

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Role Summary: RHOO (RegHub On Olympus) is a regulatory reporting framework built on the Olympus tech stack to centrally report all in-scope transactions, events, and client reports on a single, scalable, cost-effective regulatory architecture that mitigates regulatory and reputational risk through delivery of complete, accurate, and compliant reporting for various businesses. We are looking for a Big Data and AI Lead responsible for driving our organization's efforts in leveraging big data and artificial intelligence (AI) to achieve regulatory objectives. This role involves developing and implementing strategies for data collection, storage, processing, analysis, and AI-driven applications. By embracing AI/ML, RegHub On Olympus (RHOO) will remain at the forefront of regulatory reporting technology and provide a competitive advantage in the financial industry.

Required Skillset:
- 10+ years of hands-on experience in Java development
- Working experience designing systems with low-latency streaming architectures: Spark, Spring Boot, Kafka, ELK, Flink, or any other real-time streaming framework (a minimal consumer sketch follows this listing)
- Working knowledge of NoSQL, Kafka, Big Data, MongoDB
- Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala
- Experience delivering regulatory asks in extremely compressed timelines
- Experience in JMS and real-time message processing on TIBCO
- Experience using Jira and Bitbucket and managing development/testing/release efforts
- Experience using XML, FIX, POJO, and JSON message formats
- Experience with Impala, Elastic, and Oracle
- Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring; hands-on experience with Amazon Simple Storage Service (S3)
- Hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules, is an added advantage
- Good to have: knowledge of orders, executions, and trade life cycle events, plus the ability to work with message formats like FpML and ISO 20022
- Ability to lead a team from the front and guide it through time-sensitive milestones
- Focus on leveraging regulatory delivery as a driver for the firm's data strategy
- Very good communication and interpersonal skills; excellent relationships with senior technology, business, and compliance partners
- Delivery expertise for large-scale programs and the ability to align delivery with the firm's long-term strategy
- Quick decision-maker with the ability to grasp a situation when an issue arises and mitigate the impact

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills: Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
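The minimal consumer sketch referenced above, using the confluent-kafka package: the broker address, topic, and consumer group are assumptions, and the processing step is a placeholder for the validation and enrichment a regulatory pipeline would perform.

```python
# Sketch of a real-time Kafka consumer of the kind a streaming regulatory pipeline
# might use (pip install confluent-kafka); broker, topic, and group are assumptions.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "reg-reporting-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trade-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # In a real pipeline, validation, enrichment, and reporting happen here.
        print(msg.key(), msg.value())
finally:
    consumer.close()
```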

Posted 1 day ago

Apply

13.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Provide expertise in your area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency

Qualifications:
- 13+ years of relevant experience in an applications development or systems analysis role
- Extensive experience in system analysis and in programming of software applications
- Experience in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Consistently demonstrates clear and concise written and verbal communication
- Working experience designing systems with low-latency streaming architectures
- Working knowledge of NoSQL, Kafka, Big Data, MongoDB
- Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala
- Experience delivering regulatory asks in extremely compressed timelines
- Experience in JMS and real-time message processing on TIBCO
- Experience using Jira and Bitbucket and managing development/testing/release efforts
- Experience using XML, FIX, POJO, and JSON message formats
- Experience with Impala, Elastic, and Oracle
- Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring; hands-on experience with Amazon Simple Storage Service (S3)
- Hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules, is an added advantage
- Good to have: knowledge of orders, executions, and trade life cycle events, plus the ability to work with message formats like FpML and ISO 20022
- Ability to lead a team from the front and guide it through time-sensitive milestones
- Focus on leveraging regulatory delivery as a driver for the firm's data strategy

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills: Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 day ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organization, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. VOIS India In 2009, VOIS started operating in India and now has established global delivery centers in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Mode: Hybrid Location: Bangalore / Pune Experience: 5 to 8 years Core Competencies, Knowledge and Experience: Minimum experience of 4 years in data science. Excellent communication and presentation skills with a track record of engaging with business project leads. Flexibility and problem-solving ability. Previous experience in telecom would be a distinct advantage. Graduate from a tier-1 or tier-2 college with a good academic record preferred. Must-Have Technical / Professional Qualifications: Hands-on experience with analytical tools like Python, with thorough knowledge of various machine learning, analytical and statistical techniques. Expert in Python. Hands-on experience with NLP and generative AI tools and large language models (LLMs) such as Gemini, PaLM, open-source LLMs, and transformers. Good to have: cloud (GCP/AWS) or Big Data technologies (e.g., Hadoop, Hive, Pig). Must have worked on a minimum of 3 modelling projects to date across different areas; predictive modelling experience is mandatory, in actual business projects with monetary impact. Exceptional data manipulation and analysis techniques; comfortable using very large (tens of millions of rows) datasets containing both structured and unstructured data. Machine Learning, Deep Learning, Clustering & Segmentation, Binary & Multi-Class Classification, ML Forecasting & Regression Models. Stable employment history, preferably with reputed tier-1 employers. Key Accountabilities And Decision Ownership Support the build and deployment of analytical solutions (models, hypotheses, analyses and scenarios) across the spectrum of analytical maturity - descriptive, inferential, predictive and prescriptive. Leverage previously created data models, insights and analyses from across the Vodafone business to drive positive business outcomes. Drive efficiencies in how analytical services are delivered, through automation and standardisation of analytical delivery. (A minimal modelling sketch follows after this posting.) VOIS Equal Opportunity Employer Commitment VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights.
We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, color, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies that put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!
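As noted in the posting above, here is a minimal, hedged sketch of the kind of predictive-modelling workflow described. The data is synthetic and the model choice is an illustrative assumption; a real telecom use case such as churn prediction would draw features from customer and usage tables.

```python
# Minimal sketch of a binary-classification modelling workflow in Python.
# The dataset is synthetic; feature names and the gradient-boosting choice
# are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a customer feature table.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate discrimination on the held-out set.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```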

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment. Promote development standards, code reviews, mentoring, and knowledge sharing. Production support and troubleshooting. Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring. Liaise with BAs to ensure that requirements are correctly interpreted and implemented. Participate in regular planning and status meetings. Input to the development process through involvement in sprint reviews and retrospectives. Input into system architecture and design. Peer code reviews. Requirements To be successful in this role, you should meet the following requirements: Scala development and design using Scala 2.10+ or Java development and design using Java 1.8+. Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services). Sound working knowledge of the Unix/Linux platform. Hands-on experience building data pipelines using Hadoop components - Hive, Spark, Spark SQL (a minimal sketch follows below). Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins) and requirement management in JIRA. Understanding of big data modelling techniques using relational and non-relational techniques. Experience debugging code issues and publishing the highlighted differences to the development team/architects. Experience with time-series/analytics databases such as Elasticsearch. Experience with scheduling tools such as Airflow and Control-M. Understanding of or experience with cloud design patterns. Exposure to DevOps and Agile project methodologies such as Scrum and Kanban. Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets. Location: Pune and Bangalore. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
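As referenced above, here is a minimal, hedged sketch of a Hive-backed Spark pipeline. The role centres on Scala, but the shape is easiest to show compactly in PySpark; the database and table names are hypothetical, and a configured Hive metastore is assumed.

```python
# Minimal sketch of a Hadoop/Hive batch pipeline: read a Hive table with
# Spark SQL, aggregate, and write back. Database/table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("hive-pipeline-sketch")
         .enableHiveSupport()   # requires a configured Hive metastore
         .getOrCreate())

# Read events previously landed in a (hypothetical) raw Hive database.
events = spark.sql(
    "SELECT account_id, event_type, amount, event_date FROM raw.events")

# Daily totals per account, a typical aggregation step.
daily = (events
         .groupBy("account_id", "event_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("event_count")))

# Persist as a partitioned Hive table for downstream consumers.
(daily.write.mode("overwrite")
      .partitionBy("event_date")
      .saveAsTable("curated.daily_account_totals"))
```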

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges.
Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience. Strong problem-solving and troubleshooting skills. Strong communicator. Willingness to travel based on project requirements. Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075
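To ground the GCP skills this posting lists, here is a minimal, hedged sketch of querying BigQuery from Python with the official client; the project, dataset, and table names are hypothetical, and application-default credentials are assumed.

```python
# Minimal sketch: run a BigQuery query from Python with the official client.
# Project, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # uses default credentials

sql = """
    SELECT station_id, COUNT(*) AS readings
    FROM `example-project.sensors.readings`
    GROUP BY station_id
    ORDER BY readings DESC
    LIMIT 10
"""

for row in client.query(sql).result():  # blocks until the query job completes
    print(row.station_id, row.readings)
```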

Posted 1 day ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges.
Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience. Strong problem-solving and troubleshooting skills. Strong communicator. Willingness to travel based on project requirements. Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075
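This posting lists Pub/Sub among its requirements; as a hedged illustration, here is a minimal sketch of publishing to a Pub/Sub topic with the official Python client. The project and topic names are hypothetical, and default credentials are assumed.

```python
# Minimal sketch: publish messages to a Pub/Sub topic with the official
# client library. Project and topic names below are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "ingest-events")

for i in range(3):
    payload = f"event-{i}".encode("utf-8")           # Pub/Sub payloads are bytes
    future = publisher.publish(topic_path, payload)  # returns a future
    print("Published message id:", future.result())  # blocks for the server ack
```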

Posted 1 day ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges.
Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience. Strong problem-solving and troubleshooting skills. Strong communicator. Willingness to travel based on project requirements. Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075
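Dataflow appears in this posting's requirements; as a hedged illustration, here is a minimal Apache Beam word-count sketch. It runs locally on the DirectRunner; on GCP it would instead be submitted with --runner=DataflowRunner plus project, region, and staging options.

```python
# Minimal sketch of a Beam pipeline of the kind that runs on Dataflow.
# Uses the DirectRunner locally; the input strings are illustrative.
import apache_beam as beam

with beam.Pipeline() as p:  # DirectRunner by default
    (p
     | "Create" >> beam.Create(["alpha beta", "beta gamma", "alpha"])
     | "Split" >> beam.FlatMap(str.split)          # one word per element
     | "PairWithOne" >> beam.Map(lambda w: (w, 1))
     | "Count" >> beam.CombinePerKey(sum)          # classic word count
     | "Print" >> beam.Map(print))
```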

Posted 1 day ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges.
Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience. Strong problem-solving and troubleshooting skills. Strong communicator. Willingness to travel based on project requirements. Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075
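Cloud Storage transfer tooling features in this posting; as a hedged illustration, here is a minimal sketch of landing a file in Cloud Storage with the official Python client, a common first step before a BigQuery load or a Dataproc job. The bucket and object names are hypothetical.

```python
# Minimal sketch: upload a local file to Cloud Storage with the official
# client. Bucket and object names are hypothetical; the local file is
# assumed to exist.
from google.cloud import storage

client = storage.Client(project="example-project")
bucket = client.bucket("example-ingest-bucket")
blob = bucket.blob("raw/2024-01-01/events.parquet")

blob.upload_from_filename("events.parquet")
print("Uploaded to", f"gs://{bucket.name}/{blob.name}")
```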

Posted 1 day ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients’ journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Services (GCP), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist. Work you’ll do As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges.
Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders. Identify new tools and processes to improve the cloud platform and automate processes. Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience. Strong problem-solving and troubleshooting skills. Strong communicator. Willingness to travel based on project requirements. Preferred Qualifications Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075
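BigQuery Data Transfer and ingestion feature in this posting; as a hedged illustration, here is a minimal sketch of loading Parquet files from Cloud Storage into BigQuery with the official Python client, completing the GCS-to-BigQuery path sketched for the postings above. All project, bucket, and table names are hypothetical.

```python
# Minimal sketch: load Parquet files from Cloud Storage into a BigQuery
# table. All names below are hypothetical; the target dataset must exist.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-ingest-bucket/raw/2024-01-01/*.parquet",
    "example-project.sensors.readings",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("example-project.sensors.readings")
print("Table now has", table.num_rows, "rows")
```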

Posted 1 day ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description Summary of This Role Works throughout the software development life cycle and performs in a utility capacity to create, design, code, debug, maintain, test, implement and validate applications with a broad understanding of a variety of languages and architectures. Analyzes existing applications or formulates logic for new applications, procedures, flowcharting, coding and debugging programs. Maintains and utilizes application and programming documents in the development of code. Recommends changes in development, maintenance and system standards. Creates appropriate deliverables and develops application implementation plans throughout the life cycle in a flexible development environment. What Part Will You Play? Develops moderately complex code using front- and/or back-end programming languages within multiple platforms as needed, in collaboration with business and technology teams for internal and external client software solutions. Designs, creates, and delivers moderately complex program specifications for code development and support on multiple projects/issues, with a wide understanding of the application/database to better align interactions and technologies. Provides broad and in-depth knowledge of analysis, modification, and development of complex code/unit testing in order to develop concise application documentation. Performs and advises on testing, validation requirements, and corrective measures for complex code deficiencies, and provides systemic proposals. Participates in client-facing meetings, joint venture discussions, and vendor partnership teams to determine solution approaches. Provides advice to leadership on the design, development and enforcement of business/infrastructure application standards, including associated controls, procedures and monitoring, to ensure compliance and accuracy of data. Applies a full understanding and in-depth knowledge of procedures, methodology and application standards, including Payment Card Industry (PCI) security compliance. Develops, administers and recommends billable hours and resource estimates on complex initiatives, projects, and issues. Assists with on-the-job training and provides in-depth expertise and advice to software engineers. What Are We Looking For in This Role? Key Applicant Requirements (Skills/Knowledge/Experience/Qualifications) 7+ years of strong development background in Java/J2EE technologies. Strong experience in Core Java, advanced Java, design patterns, algorithms, session management, OOP concepts, data structures, collections, interfaces, multi-threading and MVC. Experience in building APIs and web services (SOAP and RESTful) and integrating with external applications. Experience with Java 8 (lambdas, streams, the Date-Time API, etc.).
Experience in the Spring Framework (Spring Boot, Spring Security). Experience with application servers (e.g., Apache, Tomcat, JBoss). Experience with build tools (e.g., Maven, Gradle). Version control using Git, VSTS, TeamForge, etc. Developing applications using the Eclipse IDE or IntelliJ is preferred. Experience in queue optimization and configuration for high throughput for the request-reply pattern, and in single and federated cluster management. Good knowledge of relational databases, SQL and JDBC drivers. Strong analytical, planning, and organizational skills with an ability to manage competing demands. Excellent communication skills, verbal and written; should be able to collaborate across business teams (stakeholders) and other technology groups as needed. Experience in NoSQL databases. Exposure to the payments industry is a plus. Key Position Details (Responsibilities) Experience as a middleware developer using Java/J2EE technologies and message queues. Experience in administration and setup of a message broker like IIB, RabbitMQ, etc. (a request-reply sketch follows below). Experience with API gateways - DataPower, APIM, Apigee, etc. Experience in a cloud platform - AWS/Azure/GCP. Experience in Google Cloud Platform - GCP Pub/Sub, Datastore, BigQuery, App Engine, Compute Engine, Cloud SQL, Memorystore, Redis, etc. Minimum Qualifications BS in Computer Science, Information Technology, Business/Management Information Systems or related field. Typically a minimum of 8 years of professional experience in coding, designing, developing and analyzing data. Typically has advanced knowledge and use of two or more opposing front-/back-end languages/technologies, from the following but not limited to: two or more modern programming languages used in the enterprise, experience working with various APIs and external services, experience with both relational and NoSQL databases. Preferred Qualifications BS in Computer Science, Information Technology, Business/Management Information Systems or related field. 7+ years of professional experience in coding, designing, developing and analyzing data, and experience with IBM Rational tools. What Are Our Desired Skills and Capabilities? Skills/Knowledge - Having wide-ranging experience, uses professional concepts and company objectives to resolve complex issues in creative and effective ways. Some barriers to entry exist at this level (e.g., dept./peer review). Job Complexity - Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors. Exercises judgment in selecting methods, techniques and evaluation criteria for obtaining results. Networks with key contacts outside own area of expertise. Supervision - Determines methods and procedures on new assignments and may coordinate activities of other personnel (Team Lead). Operating Systems: Linux distributions including one or more of the following: Ubuntu, CentOS/RHEL, Amazon Linux; Microsoft Windows. Database - Design, familiarity with DDL and DML for one or more of the following databases: Oracle, MySQL, MS SQL Server, IMS, DB2, Hadoop. Back-end technologies - Java, Python, .NET, Ruby, Mainframe COBOL, Mainframe Assembler. Front-end technologies - HTML, JavaScript, jQuery, CICS. Web Frameworks - Web technologies like Node.js, React.js, Angular, Redux. Development Tools - Eclipse, Visual Studio, Webpack, Babel, Gulp.
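As referenced in the responsibilities above, here is a minimal, hedged sketch of the request-reply messaging pattern over RabbitMQ using the pika Python client; the queue names and broker host are hypothetical, and a responder service is assumed to echo replies to the reply_to queue.

```python
# Minimal sketch of the request-reply pattern over RabbitMQ via pika.
# Queue names and the broker host are hypothetical; a responder is assumed
# to consume "rpc_requests" and publish replies to the reply_to queue.
import uuid
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

channel.queue_declare(queue="rpc_requests")

# Exclusive, auto-named queue on which to receive the reply.
callback_queue = channel.queue_declare(queue="", exclusive=True).method.queue
corr_id = str(uuid.uuid4())

channel.basic_publish(
    exchange="",
    routing_key="rpc_requests",
    properties=pika.BasicProperties(reply_to=callback_queue,
                                    correlation_id=corr_id),
    body=b"ping",
)

# Poll the callback queue for the matching reply.
for method, props, body in channel.consume(callback_queue, auto_ack=True):
    if props.correlation_id == corr_id:
        print("Reply:", body)
        break

channel.cancel()
connection.close()
```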

Posted 1 day ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Summary of This Role Works throughout the software development life cycle and performs in a utility capacity to create, design, code, debug, maintain, test, implement and validate applications with a broad understanding of a variety of languages and architectures. Analyzes existing applications or formulates logic for new applications, procedures, flowcharting, coding and debugging programs. Maintains and utilizes application and programming documents in the development of code. Recommends changes in development, maintenance and system standards. Creates appropriate deliverables and develops application implementation plans throughout the life cycle in a flexible development environment. What Part Will You Play? Develops basic to moderately complex code using front- and/or back-end programming languages within multiple platforms as needed, in collaboration with business and technology teams for internal and external client software solutions. Designs, creates, and delivers routine to moderately complex program specifications for code development and support on multiple projects/issues, with a wide understanding of the application/database to better align interactions and technologies. Analyzes, modifies, and develops moderately complex code/unit testing in order to develop concise application documentation. Performs testing and validation requirements for moderately complex code changes. Performs corrective measures for moderately complex code deficiencies and escalates alternative proposals. Participates in client-facing meetings, joint venture discussions, and vendor partnership teams to determine solution approaches. Provides support to leadership for the design, development and enforcement of business/infrastructure application standards, including associated controls, procedures and monitoring, to ensure compliance and accuracy of data. Applies a full understanding of procedures, methodology and application standards, including Payment Card Industry (PCI) security compliance. Conducts and provides basic billable hours and resource estimates on initiatives, projects and issues. Assists with on-the-job training and provides guidance to other software engineers. What Are We Looking For in This Role? Minimum Qualifications BS in Computer Science, Information Technology, Business/Management Information Systems or related field. Typically a minimum of 4 years of professional experience in coding, designing, developing and analyzing data. Typically has advanced knowledge and use of one or more front-/back-end languages/technologies and a moderate understanding of the corresponding opposite-end language/technology, from the following but not limited to: two or more modern programming languages used in the enterprise, experience working with various APIs and external services, experience with both relational and NoSQL databases. Preferred Qualifications BS in Computer Science, Information Technology, Business/Management Information Systems or related field. 6+ years of professional experience in coding, designing, developing and analyzing data, and experience with IBM Rational tools. What Are Our Desired Skills and Capabilities? Skills/Knowledge - A seasoned, experienced professional with a full understanding of area of specialization; resolves a wide range of issues in creative ways. This job is the fully qualified, career-oriented, journey-level position. Job Complexity - Works on problems of diverse scope where analysis of data requires evaluation of identifiable factors.
Demonstrates good judgment in selecting methods and techniques for obtaining solutions. Networks with senior internal and external personnel in own area of expertise. Supervision - Normally receives little instruction on day-to-day work, general instructions on new assignments. Operating Systems: Linux distributions including one or more of the following: Ubuntu, CentOS/RHEL, Amazon Linux; Microsoft Windows; z/OS; Tandem/HP NonStop. Database - Design, familiarity with DDL and DML for one or more of the following databases: Oracle, MySQL, MS SQL Server, IMS, DB2, Hadoop. Back-end technologies - Java, Python, .NET, Ruby, Mainframe COBOL, Mainframe Assembler. Front-end technologies - HTML, JavaScript, jQuery, CICS. Web Frameworks - Web technologies like Node.js, React.js, Angular, Redux. Development Tools - Eclipse, Visual Studio, Webpack, Babel, Gulp. Mobile Development - iOS, Android. Machine Learning - Python, R, MATLAB, TensorFlow, DMTK.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

Google Cloud Platform - Data Engineer

Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span from cloud strategy to implementation, from migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed.

You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies, and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do

As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementations of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring, and much more.

The key responsibilities may involve some or all of the areas listed below (a minimal pipeline sketch follows the posting):
- Act as a trusted technical advisor to customers and solve complex Big Data challenges.
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience
- Experience in Cloud SQL and Cloud Bigtable
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume)
- Experience working with technical customers
- Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript

Consulting Requirements
- 3-6 years of relevant consulting, industry, or technology experience
- Strong problem-solving and troubleshooting skills
- Strong communicator
- Willingness to travel if required by the project

Preferred Qualifications
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
- Experience in technical consulting
- Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow)
- Working knowledge of ITIL and/or agile methodologies

Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300075
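
To make the ingest-store-analyze responsibilities above concrete, here is a minimal sketch using the google-cloud-bigquery Python client. The project, bucket, and table names are hypothetical placeholders, and a real pipeline would add orchestration and error handling.

```python
# Minimal sketch of a batch ingest-and-query flow on BigQuery.
# Assumes the google-cloud-bigquery package, application default credentials,
# and hypothetical bucket/table names -- adapt to a real project.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

# Ingest: load Parquet files from Cloud Storage into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.parquet",  # hypothetical GCS path
    "example-project.analytics.events",      # hypothetical destination table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # block until the load completes

# Analyze: run a simple aggregation over the loaded data.
query = """
    SELECT event_type, COUNT(*) AS n
    FROM `example-project.analytics.events`
    GROUP BY event_type
    ORDER BY n DESC
"""
for row in client.query(query).result():
    print(row.event_type, row.n)
```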

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description:

About Us

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow, and make an impact, along with the power to make a difference. Join us!

Global Business Services

Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence, and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview

The Batch Support Services team is responsible for all Retail, Preferred, and Global Wealth & Investment Management (GWIM) business-aligned infrastructure, and provides stability and resiliency in a standardized, production-like, end-to-end batch testing environment to enhance speed-to-market capabilities by:
- Balancing MIPS batch workload in the test environments
- Leveraging automated scheduling, notifications, and dashboards
- Centralizing batch execution by domain, incorporating common tools, and ensuring automated hand-offs of critical processes
- Evaluating and building out critical batch environments encompassing mainframe and mid-range batch applications, reducing manual intervention

Job Description

As a member of the Batch Support Services team, you will be responsible for supporting Midrange (DataStage & Autosys) and Hadoop batch, scheduling support and maintenance across multiple test environments, and batch execution. You will use support tools to navigate through logs during problem analysis, adhering to standards and procedures for technical and change implementation of scheduling support, and identify and implement opportunities for process improvements, potential risks, and increased efficiencies as part of batch optimization. The role requires the ability to work in cross-functional and multi-location teams.

Responsibilities:
- Analyze and support batch application testing of Mid-Range (DataStage & Autosys, Hadoop) batch components in integrated and independent test environments.
- Understand the functionality of change and problem requests.
- Analyze batch issues and provide resolutions for midrange applications.
- Analyze impact on the existing system and provide estimation.
- Work with multi-platform batch application teams to optimize testing capabilities and production deployments.
- Analyze and develop batch components for midrange using IBM IIS DataStage, Autosys, PySpark, Hadoop, and UNIX shell scripting (a minimal PySpark sketch follows the posting).
- Optimize multi-platform batch applications (midrange and mainframe) using testing capabilities.
- Support test batch execution for midrange and mainframe applications as part of integrated and independent releases.
- Understand the functionality of change and problem requests, and optimize batches based on system understanding.
- Write UNIX shell scripts for various functions such as maintenance, backup, and server health checks.
- Perform application support activities using Endevor and Subversion (SVN).
- Coordinate with required stakeholders (Release Management, Data Management, and Configuration Management) to support the project.

Requirements:

Education: B.E./B.Tech/M.E./M.Tech/BSC/MSC/BCA/MCA (IT/CS specialization preferred)

Certifications (if any): NA

Experience Range: 2-4 years

Foundational Skills:
- DataStage, Autosys, Hadoop, PySpark
- UNIX shell scripting

Desired Skills:
- Mainframe batch experience with CA7, JCL, TSO, NDM, COBOL, DB2, IMS

Work Timings: Rotational shift (any shift between 6:30 AM and 10:30 PM IST). Will be required to work in shifts for coverage during offshore hours, including weekends.

Job Location: Chennai, Hyderabad, Mumbai, Gurugram, GIFT City
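
As a rough illustration of the PySpark batch components mentioned in the responsibilities above, here is a minimal batch-step sketch in Python. The HDFS paths, column names, and aggregation are hypothetical placeholders, not taken from the posting.

```python
# Minimal sketch of a PySpark batch step of the kind this role supports.
# Assumes a Spark installation and hypothetical HDFS paths -- adapt to the
# real environment and schemas.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-example").getOrCreate()

# Read one day's raw feed from HDFS (hypothetical path and layout).
df = spark.read.option("header", True).csv("hdfs:///data/raw/transactions/2024-01-01")

# Simple transform: keep rows with a numeric amount and aggregate per account.
summary = (
    df.filter(F.col("amount").cast("double").isNotNull())
      .groupBy("account_id")
      .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Write the batch output back to HDFS for downstream jobs (hypothetical path).
summary.write.mode("overwrite").parquet("hdfs:///data/curated/transactions_daily/2024-01-01")

spark.stop()
```

In practice a scheduler such as Autosys would invoke a step like this via spark-submit, with the date parameterized rather than hard-coded.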

Posted 1 day ago

Apply