Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
7.0 years
0 Lacs
Hyderābād
On-site
JOB DESCRIPTION

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

Job responsibilities
● Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices
● Evaluates new and current technologies using existing data architecture standards and frameworks
● Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
● Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
● Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes
● Serves as a function-wide subject matter expert in one or more areas of focus
● Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
● Influences peers and project decision-makers to consider the use and application of leading-edge technologies
● Advises junior architects and technologists

Required qualifications, capabilities, and skills
● 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability
● Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge of the data architecture discipline and its solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design)
● Practical cloud-based data architecture and deployment experience, preferably on AWS
● Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres
● Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores
● Advanced in one or more data engineering disciplines, e.g., streaming, ELT, event processing
● Ability to tackle design and functionality problems independently with little to no oversight
● Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture

Preferred qualifications, capabilities, and skills
● Financial services experience; card and banking a big plus
● Practical experience in modern data processing technologies, e.g., Kafka streaming, dbt, Spark, Airflow
● Practical experience in data mesh and/or data lake architectures
● Practical experience in machine learning/AI with Python development a big plus
● Practical experience in graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin
● Knowledge of architecture assessment frameworks, e.g., the Architecture Tradeoff Analysis Method (ATAM)

ABOUT US

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
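The qualification about deploying one logical data model as either an operational or an analytical store can be illustrated with a minimal, hypothetical sketch using SQLite: the same "orders" model backs a normalized transactional table and a denormalized reporting aggregate. Table and column names are illustrative only.

```python
import sqlite3

# A minimal, hypothetical sketch: one logical model ("orders") deployed two ways.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Operational store: normalized, row-oriented, optimized for transactional writes.
cur.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        order_date  TEXT NOT NULL,
        amount      REAL NOT NULL
    )
""")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        (1, 101, "2025-06-01", 40.0),
        (2, 101, "2025-06-01", 60.0),
        (3, 102, "2025-06-02", 25.0),
    ],
)

# Analytical store: the same model denormalized into a daily aggregate,
# optimized for read-heavy reporting queries.
cur.execute("""
    CREATE TABLE daily_sales AS
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")

rows = cur.execute("SELECT * FROM daily_sales ORDER BY order_date").fetchall()
print(rows)  # [('2025-06-01', 2, 100.0), ('2025-06-02', 1, 25.0)]
```

In practice the two deployments would live in different systems (e.g., Postgres for the operational store, Snowflake for the analytical one), but the modeling trade-off is the same.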
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM

Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
Posted 5 days ago
9.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Generative AI – Application Developer

EY’s GDS Tax Technology team’s mission is to develop, implement and integrate technology solutions that better serve our clients and engagement teams. As a member of EY’s core Tax practice, you’ll develop deep tax technical knowledge and outstanding database, data analytics and programming skills.

Ever-increasing regulations require tax departments to gather, organize and study more data than ever before. Often the data necessary to satisfy these complex regulations must be collected from a variety of systems and departments throughout an organization. Effectively and efficiently handling this variety and volume of data is often extremely challenging and time-consuming for a company.

EY’s GDS Tax Technology team members work side-by-side with the firm’s partners, clients and tax technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies and enable our clients with disruptive and market-leading tools supporting Tax. GDS Tax Technology works closely with clients and professionals in the following areas: Federal Business Tax Services, Partnership Compliance, Corporate Compliance, Indirect Tax Services, Human Capital, and Internal Tax Services. GDS Tax Technology provides solution architecture, application development, testing and maintenance support to the global Tax service line, both on a proactive basis and in response to specific requests.

EY is currently seeking a Generative AI – Application Developer (.NET) to join our Tax Technology practice in Bangalore and Kolkata, India.
The opportunity

We’re looking for Tax Seniors with expertise in full-stack application development using .NET C# for Generative AI applications to join the TTT team in the Tax Service Line. This is a fantastic opportunity to be part of a pioneer firm whilst being instrumental in the growth of a new service offering.

Your Key Responsibilities
● Design, develop, and implement AI agents/plugins/interfaces and APIs, ensuring integration with various systems aligns with the core product/platform development strategy.
● Estimate and manage technical efforts, including work breakdown structures, risks, and solutions, while adhering to development methodologies and KPIs.
● Maintain effective communication within the team and with stakeholders, proactively managing expectations and collaborating on problem-solving.
● Contribute to the refinement of development/engineering methodologies and standards, anticipating potential issues and leading the resolution process.

Skills And Attributes For Success

Must-Have:
● Skilled in full-stack application development with .NET C#, REST APIs, React or other TypeScript-based UI frameworks, and SQL databases
● Advanced knowledge of Azure services such as Azure App Service, Azure Functions, Entra ID, etc.
● Containerisation: Docker, Azure Container Apps, Azure Kubernetes Service (AKS)
● NoSQL databases such as Cosmos DB or MongoDB
● Working experience with source control such as Git or TFVC
● CI/CD pipelines: Azure DevOps, GitHub Actions, etc.
● Generative AI application development with Azure OpenAI, Semantic Kernel, and vector databases like Azure AI Search, Postgres, etc.
● Fundamental understanding of various types of Large Language Models (LLMs)
● Fundamental understanding of Retrieval-Augmented Generation (RAG) techniques
● Fundamental understanding of classical AI/ML
● Skilled in advanced prompt engineering

Nice-to-Have:
● Awareness of various AI agent/agentic workflow frameworks and SDKs
● Graph databases such as Neo4j
● Experience with M365 Copilot Studio
● Microsoft Azure AI-900/AI-102 certification

Behavioural Skills:
● Excellent learning ability.
● Strong communication skills.
● Flexibility to work both independently and as part of a larger team.
● Strong analytical skills and attention to detail.
● The ability to adapt your work style to work with both internal and client team members.

To qualify for the role, you must have
● A bachelor’s or master’s degree in software engineering or information technology, or BE/B.Tech
● An overall 5–9 years of experience.

Ideally, you’ll also have
● Thorough knowledge of the Tax or Finance domain.
● Strong analytical skills and attention to detail.

What We Look For
● A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
● An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide.
● Opportunities to work with EY TAS practices globally with leading businesses across a range of industries

What We Offer

EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe.
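Retrieval-Augmented Generation, named in the must-have list above, pairs a retrieval step over embedded documents with an LLM prompt. Below is a minimal, library-free sketch of the retrieval half; the hand-made three-dimensional "embeddings" and the prompt template are illustrative stand-ins, not any specific Azure OpenAI or Semantic Kernel API.

```python
import math

# Toy corpus with hypothetical, hand-made 3-dim "embeddings"; a real system
# would use an embedding model and a vector database (e.g. Azure AI Search).
DOCS = {
    "VAT returns are filed quarterly.":        [0.9, 0.1, 0.0],
    "Depreciation is computed straight-line.": [0.1, 0.8, 0.2],
    "Invoices must retain a tax ID.":          [0.7, 0.2, 0.3],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec, k=1):
    # Rank documents by cosine similarity to the query embedding.
    ranked = sorted(DOCS, key=lambda d: cosine(DOCS[d], query_vec), reverse=True)
    return ranked[:k]

def build_prompt(question, query_vec):
    # Augment the prompt with retrieved context before calling an LLM.
    context = "\n".join(retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer using only the context."

prompt = build_prompt("How often are VAT returns filed?", [0.95, 0.05, 0.05])
print(prompt.splitlines()[1])  # the best-matching context line
```

The production pattern is the same shape: embed the query, do a nearest-neighbour search in the vector store, and prepend the hits to the model prompt.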
We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success, as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 5 days ago
2.0 years
3 Lacs
Coimbatore
On-site
Technical Expertise (minimum 2 years' relevant experience):
● Solid understanding of Generative AI models and Natural Language Processing (NLP) techniques, including Retrieval-Augmented Generation (RAG) systems, text generation, and embedding models.
● Exposure to Agentic AI concepts, multi-agent systems, and agent development using open-source frameworks like LangGraph and LangChain.
● Hands-on experience with modality-specific encoder models (text, image, audio) for multi-modal AI applications.
● Proficient in model fine-tuning and prompt engineering, using both open-source and proprietary LLMs.
● Experience with model quantization, optimization, and conversion techniques (FP32 to INT8, ONNX, TorchScript) for efficient deployment, including on edge devices.
● Deep understanding of inference pipelines, batch processing, and real-time AI deployment on both CPU and GPU.
● Strong MLOps knowledge with experience in version control, reproducible pipelines, continuous training, and model monitoring using tools like MLflow, DVC, and Kubeflow.
● Practical experience with scikit-learn, TensorFlow, and PyTorch for experimentation and production-ready AI solutions.
● Familiarity with data preprocessing, standardization, and knowledge graphs (nice to have).
● Strong analytical mindset with a passion for building robust, scalable AI solutions.
● Skilled in Python, writing clean, modular, and efficient code.
● Proficient in RESTful API development using Flask, FastAPI, etc., with integrated AI/ML inference logic.
● Experience with MySQL, MongoDB, and vector databases like FAISS, Pinecone, or Weaviate for semantic search.
● Exposure to Neo4j and graph databases for relationship-driven insights.
● Hands-on with Docker and containerization to build scalable, reproducible, and portable AI services.
● Up-to-date with the latest in GenAI, LLMs, Agentic AI, and deployment strategies.
● Strong communication and collaboration skills; able to contribute in cross-functional and fast-paced environments.

Bonus Skills
● Experience with cloud deployments on AWS, GCP, or Azure, including model deployment and model inferencing.
● Working knowledge of Computer Vision and real-time analytics using OpenCV, YOLO, and similar tools.

Job Type: Full-time
Pay: From ₹300,000.00 per year
Schedule: Day shift
Experience: AI Engineer: 1 year (Required)
Work Location: In person
Expected Start Date: 23/06/2025
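FP32-to-INT8 quantization, listed above, maps floating-point tensor values onto 8-bit integers via a scale and zero-point. The following is a minimal, framework-free sketch of the affine (asymmetric) scheme; real deployments would use PyTorch or ONNX Runtime quantization tooling rather than hand-rolled code, and the weight values here are arbitrary.

```python
def quantize_int8(values):
    """Affine (asymmetric) quantization of a list of floats to uint8-range ints."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255 if hi != lo else 1.0
    zero_point = round(-lo / scale)
    return [max(0, min(255, round(v / scale) + zero_point)) for v in values], scale, zero_point

def dequantize(q, scale, zero_point):
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, 0.0, 0.62, 2.0]
q, scale, zp = quantize_int8(weights)
restored = dequantize(q, scale, zp)

# Quantization is lossy: the round trip is close but not exact, with error
# bounded by roughly one quantization step (the scale).
max_err = max(abs(w - r) for w, r in zip(weights, restored))
print(q, round(max_err, 4))
```

The same scale/zero-point pair is what gets stored alongside an INT8 tensor in an ONNX or TorchScript export so that inference can dequantize on the fly.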
Posted 5 days ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Summary

Pfizer’s purpose is to deliver breakthroughs that change patients’ lives. Research and Development is at the heart of fulfilling Pfizer’s purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world.

Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes.

Role Responsibilities
● Lead data modeling and engineering efforts within advanced data platform teams to achieve digital outcomes; provide guidance and lead or co-lead moderately complex projects.
● Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes.
● Lead the architecture, design, and implementation of Cloud Data Lakes, Data Warehouses, Data Marts, and Data APIs.
● Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise.
● Collaborate effectively with contractors to deliver technical enhancements.
● Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
● Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency.
● Conduct root cause analysis and address production data issues.
● Lead the design, development, and implementation of AI models and algorithms to solve sophisticated data analytics and supply chain initiatives.
● Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects.
● Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives.
● Document and present findings, methodologies, and project outcomes to various stakeholders.
● Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery.
● Work with large and complex datasets, including data cleaning, preprocessing, and feature selection.

Basic Qualifications
● A bachelor's or master’s degree in computer science, artificial intelligence, machine learning, or a related discipline.
● Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations.
● Over 2 years of experience in AI, machine learning, and large language model (LLM) development and deployment.
● A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
● Strong understanding of data structures, algorithms, and software design principles.
● Programming Languages: proficiency in Python and SQL, and familiarity with Java or Scala.
● AI and Automation: knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect, and the ability to use GenAI or agents to augment data engineering practices.

Preferred Qualifications
● Data Warehousing: experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
● ETL Tools: knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
● Big Data Technologies: familiarity with Hadoop, Spark, and Kafka for big data processing.
● Cloud Platforms: hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
● Containerization: understanding of Docker and Kubernetes for containerization and orchestration.
● Data Integration: skills in integrating data from various sources, including APIs, databases, and external files.
● Data Modeling: understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune.
● Structured Data: proficiency in handling structured data from relational databases, data warehouses, and spreadsheets.
● Unstructured Data: experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch.
● Data Excellence: familiarity with data excellence concepts, including data governance, data quality management, and data stewardship.

Non-standard Work Schedule, Travel Or Environment Requirements

Occasional travel required.

Work Location Assignment: Hybrid

The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 12.5% of the base salary and eligibility to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits | (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.
Sunshine Act Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative. EEO & Employment Eligibility Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States. Information & Business Tech
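The Pfizer role above centres on automated ETL pipelines with thorough data validation. A minimal, hypothetical extract-validate-transform step of the kind such pipelines chain together is sketched below; all names and data are illustrative, and a real pipeline would wrap stages like these in orchestrator tasks (e.g., Apache Airflow or Prefect).

```python
# Hypothetical raw feed: stringly-typed rows with inconsistent casing and one bad record.
RAW_SHIPMENTS = [
    {"sku": "A-100", "qty": "5",  "site": "chennai"},
    {"sku": "B-200", "qty": "-3", "site": "pune"},      # fails the quality check
    {"sku": "A-100", "qty": "7",  "site": "Chennai"},
]

def validate(row):
    """Quality check: quantity must parse as a positive integer."""
    try:
        return int(row["qty"]) > 0
    except ValueError:
        return False

def transform(rows):
    """Normalize casing, cast types, and aggregate per (sku, site)."""
    totals = {}
    for row in (r for r in rows if validate(r)):
        key = (row["sku"], row["site"].title())
        totals[key] = totals.get(key, 0) + int(row["qty"])
    return totals

totals = transform(RAW_SHIPMENTS)
print(totals)  # {('A-100', 'Chennai'): 12}
```

Rejected rows would normally be routed to a quarantine table for the root-cause analysis the responsibilities describe, rather than silently dropped.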
Posted 5 days ago
4.0 - 7.0 years
13 - 17 Lacs
Pune
Work from Office
Overview

The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities

As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
● Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
● Exposure to SQL databases like Oracle, MySQL, or Microsoft SQL Server is a must.
● Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
● Exposure to NoSQL databases like Neo4j or document databases is likewise good to have.

What we offer you
● Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
● Flexible working arrangements, advanced technology, and collaborative workspaces.
● A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
● A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
● Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
● Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
● We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law.
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
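Matching data across vendor feeds and assigning internal identifiers, as the MSCI overview describes, can be sketched minimally as keyed deduplication. The normalization key and the "MSCI-" id format below are illustrative choices for the sketch, not MSCI's actual scheme.

```python
import itertools

# Hypothetical vendor feeds describing the same security with different conventions.
FEED_A = [{"name": "Acme Corp.", "ticker": "ACME"}]
FEED_B = [{"name": "ACME CORPORATION", "ticker": "acme"}]

_ids = itertools.count(1)
_registry = {}

def normalize_key(record):
    # Illustrative matching key: the trimmed, upper-cased ticker.
    return record["ticker"].strip().upper()

def assign_internal_id(record):
    """Match on the normalized key; reuse the internal id if already seen."""
    key = normalize_key(record)
    if key not in _registry:
        _registry[key] = f"MSCI-{next(_ids):06d}"
    return _registry[key]

ids = [assign_internal_id(r) for r in FEED_A + FEED_B]
print(ids)  # both feeds resolve to the same internal identifier
```

Production matching is far fuzzier (multiple keys, name similarity, survivorship rules), but the registry-plus-normalized-key shape is the core of identifier assignment.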
Posted 5 days ago
6.0 - 11.0 years
25 - 30 Lacs
Chennai
Work from Office
Internal Data Structures & Modeling
● Design, maintain, and optimize internal data models and structures within the Flexera environment.
● Map business asset data to Flexera's normalized software models with precision and accuracy.
● Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking.
● Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems.

Reporting & Analytics
● Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau.
● Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure.
● Automate recurring reporting processes and ensure timely delivery to business stakeholders.
● Work closely with business users to gather requirements and translate them into meaningful reports and visualizations.

Automated Data Feeds & API Integrations
● Develop and support automated data feeds using Flexera REST/SOAP APIs.
● Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow.
● Monitor, troubleshoot, and resolve issues related to data extracts and API communication.
● Implement robust logging, alerting, and exception handling for integration pipelines.

Skills

Must have
● 6+ years of experience working with Flexera or similar software.
● Flexera Expertise: strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools.
● Technical Skills: proficient in REST/SOAP API development and integration; strong SQL skills and familiarity with data transformation/normalization concepts; experience using reporting tools like Power BI, Tableau, or Excel for data visualization; familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc.
● Process & Problem Solving: strong analytical and troubleshooting skills for data inconsistencies and API failures; understanding of license models, software contracts, and compliance requirements.

Nice to have
● Soft Skills: excellent communication skills to translate technical data into business insights.

Other Languages: English: C1 Advanced
Seniority: Senior
Senior Flexera Data Analyst | Chennai, India | Req. VR-114544 | Data Science | BCM Industry | 23/05/2025
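The integration duties above (automated feeds, exception handling, alerting) commonly reduce to a paginated pull with retries. Below is a generic, hypothetical sketch: the page-keyed callable stands in for a real REST call, and nothing here reflects the actual Flexera API surface.

```python
import time

def fetch_all_pages(fetch_page, max_retries=3, backoff_s=0.01):
    """Pull pages until an empty one, retrying each page with linear backoff.

    `fetch_page(page)` is any callable returning a list of records; in a real
    integration it would call the vendor's REST endpoint.
    """
    records, page = [], 1
    while True:
        for attempt in range(1, max_retries + 1):
            try:
                batch = fetch_page(page)
                break
            except ConnectionError:
                if attempt == max_retries:
                    raise  # surface to the pipeline's alerting layer
                time.sleep(backoff_s * attempt)
        if not batch:
            return records
        records.extend(batch)
        page += 1

# Stub feed: two pages of data, with page 1 failing once before succeeding.
calls = {"failures_left": 1}
def stub_feed(page):
    if page == 1 and calls["failures_left"]:
        calls["failures_left"] -= 1
        raise ConnectionError("transient network error")
    return {1: [{"sku": "app-1"}], 2: [{"sku": "app-2"}]}.get(page, [])

rows = fetch_all_pages(stub_feed)
print(rows)  # [{'sku': 'app-1'}, {'sku': 'app-2'}]
```

Injecting the fetcher as a callable keeps the retry/pagination logic testable without network access, which is also how the logging and alerting hooks around it would be unit-tested.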
Posted 6 days ago
6.0 - 11.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Internal Data Structures & Modeling
● Design, maintain, and optimize internal data models and structures within the Flexera environment.
● Map business asset data to Flexera's normalized software models with precision and accuracy.
● Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking.
● Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems.

Reporting & Analytics
● Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau.
● Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure.
● Automate recurring reporting processes and ensure timely delivery to business stakeholders.
● Work closely with business users to gather requirements and translate them into meaningful reports and visualizations.

Automated Data Feeds & API Integrations
● Develop and support automated data feeds using Flexera REST/SOAP APIs.
● Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow.
● Monitor, troubleshoot, and resolve issues related to data extracts and API communication.
● Implement robust logging, alerting, and exception handling for integration pipelines.

Skills

Must have
● 6+ years of experience working with Flexera or similar software.
● Flexera Expertise: strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools.
● Technical Skills: proficient in REST/SOAP API development and integration; strong SQL skills and familiarity with data transformation/normalization concepts; experience using reporting tools like Power BI, Tableau, or Excel for data visualization; familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc.
● Process & Problem Solving: strong analytical and troubleshooting skills for data inconsistencies and API failures; understanding of license models, software contracts, and compliance requirements.

Nice to have
● Soft Skills: excellent communication skills to translate technical data into business insights.

Other Languages: English: C1 Advanced
Seniority: Senior
Senior Flexera Data Analyst | Bengaluru, India | Req. VR-114544 | Data Science | BCM Industry | 23/05/2025
Posted 6 days ago
6.0 - 11.0 years
25 - 30 Lacs
Gurugram
Work from Office
Same role and requirements as the Senior Flexera Data Analyst listing above.
Gurugram, India
Req. VR-114544
Data Science
BCM Industry
23/05/2025
Apply for Senior Flexera Data Analyst in Gurugram
Posted 6 days ago
3.0 - 5.0 years
13 - 15 Lacs
Bengaluru
Work from Office
Graph Data Modeling & Implementation
Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript that interact with the graph database. Mentor junior developers and review code to maintain high quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions.

Skills
Must have
12+ years in software/data engineering, with at least 3-5 years of hands-on Neo4j experience.
Strong proficiency in the Cypher query language, graph modeling, and data visualization tools (e.g., Neo4j Bloom, Neo4j Browser).
Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages.
Experience with APOC procedures, Neo4j plugins, and query optimization.
Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes).
Proven experience leading the technical strategy, architecture, and delivery of Neo4j-based solutions, and leading engineering teams or projects.
Defines and enforces performance tuning, security standards, and disaster recovery strategies for Neo4j; stays up to date with emerging technologies in the graph database and data engineering space.
Excellent problem-solving and communication skills.

Nice to have
N/A

Other Languages: English: C1 Advanced
Seniority: Senior
Bengaluru, India
Req. VR-114556
Data Science
BCM Industry
23/05/2025
Apply for Data Engineer with Neo4j in Bengaluru
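The graph-modeling work described above comes down to representing entities as nodes and their connections as typed relationships, then querying along those relationships. A toy in-memory sketch follows; the Person/Company schema and WORKS_AT relationship are invented for illustration, with the roughly equivalent Cypher shown in a comment.

```python
# Toy adjacency-list graph illustrating the node/typed-relationship model.
# In Cypher, the query below would read roughly:
#   MATCH (p:Person)-[:WORKS_AT]->(c:Company {name: "Acme"}) RETURN p.name
from collections import defaultdict

edges = defaultdict(list)  # node -> list of (relationship type, target node)

def add_edge(src, rel, dst):
    edges[src].append((rel, dst))

add_edge("alice", "WORKS_AT", "Acme")
add_edge("bob", "WORKS_AT", "Acme")
add_edge("carol", "WORKS_AT", "Globex")

def employees_of(company):
    """Return everyone with a WORKS_AT relationship pointing at `company`."""
    return sorted(src for src, out in edges.items()
                  for rel, dst in out
                  if rel == "WORKS_AT" and dst == company)

print(employees_of("Acme"))  # ['alice', 'bob']
```

A real Neo4j deployment replaces the dict with the database and the traversal with Cypher, but the modeling decision (which entities become nodes, which connections become typed relationships) is the same one this role makes at scale.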
Posted 6 days ago
3.0 - 5.0 years
13 - 15 Lacs
Gurugram
Work from Office
Same role and requirements as the Data Engineer with Neo4j listing above.
Gurugram, India
Req. VR-114556
Data Science
BCM Industry
23/05/2025
Apply for Data Engineer with Neo4j in Gurugram
Posted 6 days ago
3.0 - 5.0 years
13 - 15 Lacs
Chennai
Work from Office
Same role and requirements as the Data Engineer with Neo4j listing above.
Chennai, India
Req. VR-114556
Data Science
BCM Industry
23/05/2025
Apply for Data Engineer with Neo4j in Chennai
Posted 6 days ago
2.0 - 5.0 years
4 - 8 Lacs
Chennai
Work from Office
Overview
Java development with hands-on experience in Spring Boot. Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications. Experience with Kubernetes for managing microservices-based applications in a cloud environment. Familiarity with Postgres (relational) and Neo4j (graph) databases for managing complex data models. Experience in metadata modeling and designing data structures that support high performance and scalability. Expertise in Camunda BPMN and business process automation. Experience implementing rules with the Drools rules engine. Knowledge of Unix/Linux systems for application deployment and management. Experience building data ingestion frameworks to process and handle large datasets.

Responsibilities
Java development with hands-on experience in Spring Boot. Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications. Experience with Kubernetes for managing microservices-based applications in a cloud environment. Familiarity with Postgres (relational) and Neo4j (graph) databases for managing complex data models. Experience in metadata modeling and designing data structures that support high performance and scalability. Expertise in Camunda BPMN and business process automation. Experience implementing rules with the Drools rules engine. Knowledge of Unix/Linux systems for application deployment and management. Experience building data ingestion frameworks to process and handle large datasets.
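The Drools requirement above is about rule engines: declarative condition/action pairs evaluated against facts. A minimal sketch of that pattern follows, in Python rather than Drools' own rule language; the rule names, the fact shape, and the integer discount units are all hypothetical.

```python
# Minimal condition/action rules-engine sketch (the pattern behind engines
# like Drools). Rule names, fact fields, and discount values are invented.
RULES = [
    # (rule name, condition on the fact, action mutating the fact)
    ("large-order-discount",
     lambda f: f["total"] >= 1000,
     lambda f: f.update(discount=10)),
    ("loyal-customer-bonus",
     lambda f: f.get("years", 0) >= 5,
     lambda f: f.update(discount=f.get("discount", 0) + 5)),
]

def apply_rules(fact: dict) -> list:
    """Fire every rule whose condition matches; return the fired rule names."""
    fired = []
    for name, condition, action in RULES:
        if condition(fact):
            action(fact)
            fired.append(name)
    return fired

order = {"total": 1200, "years": 6}
print(apply_rules(order), order["discount"])  # both rules fire; discount 15
```

Production engines add working-memory management, rule salience, and re-evaluation after each mutation, but the core idea (business logic kept as data-like rules, separate from application code) is what this listing is asking for.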
Posted 6 days ago
2.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Summary
AI & Engineering | AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.

Job Title: Generative AI Developer
Job Summary: We are looking for a Generative AI Developer with hands-on experience designing, developing, and deploying AI and Generative AI models that generate high-quality content such as text, images, and chatbots. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.

Key Responsibilities:
Deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand business problem statements and project requirements.
Work with a team of data engineers, ML/AI engineers, prompt engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build, and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits, and proposals.
Drive a human-led culture of inclusion and diversity by caring deeply for all team members.

Qualifications:
2-7 years of relevant hands-on experience in Generative AI, deep learning, or NLP.
Bachelor's or Master's degree in a quantitative field.
Strong hands-on experience with programming languages such as Python, CUDA, and SQL, and frameworks such as TensorFlow, PyTorch, and Keras.
Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3, and Mistral, along with RAG and agentic workflows.
Well versed in GAN and Transformer architectures, knowledgeable about diffusion models, and up to date with new research and progress in the field of Gen AI.
Should follow research papers and comprehend, innovate on, and present the best approaches and solutions related to Generative AI components.
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
Knowledge of vector databases and Neo4j or other relevant graph databases.
Familiarity with Docker containerization, Git, etc.
AI/cloud certification from a premier institute is preferred.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 303628
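The RAG workflow named in the qualifications above has a simple core: retrieve the documents most similar to a query, then ground the model's prompt in them. The sketch below uses bag-of-words vectors and cosine similarity purely for illustration; real systems use learned embeddings and a vector database, and the document texts here are invented.

```python
# Toy retrieval step of a RAG pipeline: rank documents by cosine similarity
# over bag-of-words vectors, then assemble a grounded prompt. Illustrative
# only; production systems use learned embeddings and a vector store.
import math
from collections import Counter

DOCS = [
    "Neo4j is a graph database that stores nodes and relationships.",
    "Transformers use self-attention to model token dependencies.",
    "Flexera tracks software licenses across an enterprise.",
]

def vec(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    q = vec(query)
    return sorted(DOCS, key=lambda d: cosine(q, vec(d)), reverse=True)[:k]

context = retrieve("how does a graph database store relationships?")[0]
prompt = f"Answer using only this context:\n{context}\nQuestion: how does a graph database store relationships?"
print(context)
```

The generation step then sends `prompt` to an LLM; swapping the scorer for dense embeddings and the list for a vector index turns this sketch into the architecture the role describes.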
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Summary
AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Job Title: Senior Generative AI Developer/Team Lead
Job Summary: We are looking for a Generative AI Team Lead with hands-on experience designing, developing, and deploying AI and Generative AI models that generate high-quality content such as text, images, and chatbots. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.

Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand business problem statements and project requirements.
Lead a team of data engineers, ML/AI engineers, prompt engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build, and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits, and proposals.
Drive a human-led culture of inclusion and diversity by caring deeply for all team members.

Qualifications:
6-10 years of relevant experience in Generative AI, deep learning, or NLP.
Bachelor's or Master's degree in a quantitative field.
Has led a 3-5 member team on multiple end-to-end AI/GenAI projects.
Excellent communication and client/stakeholder management skills.
Strong hands-on experience with programming languages such as Python, CUDA, and SQL, and frameworks such as TensorFlow, PyTorch, and Keras.
Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3, and Mistral, along with RAG and agentic workflows.
Well versed in GAN and Transformer architectures, knowledgeable about diffusion models, and up to date with new research and progress in the field of Gen AI.
Should follow research papers and comprehend, innovate on, and present the best approaches and solutions related to Generative AI components.
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
Knowledge of vector databases and Neo4j or other relevant graph databases.
Familiarity with Docker containerization, Git, etc.
AI/cloud certification from a premier institute is preferred.

Requisition code: 303629
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Same role and requirements as the Senior Generative AI Developer/Team Lead listing above.
Greater Kolkata Area
Requisition code: 303629
Posted 6 days ago
2.0 - 7.0 years
0 Lacs
Greater Kolkata Area
On-site
Same role and requirements as the Generative AI Developer listing above.
Greater Kolkata Area
Requisition code: 303628
Posted 6 days ago
3.0 years
4 - 7 Lacs
Hyderābād
On-site
Overview: Software Developer (Full-Stack JavaScript/Angular/C#/NodeJS Developer)

The role offers the right developer a chance to work on our enterprise-scale web application in an established but evolving company that is looking to new technology and great design to build products that leapfrog the competition.

Responsibilities: This full-stack developer will have good familiarity with HTML5/CSS/JavaScript and be competent using at least one modern JS framework (e.g. Angular, React, Vue). They will be confident using C#.NET/Java/NodeJS or an equivalent backend technology. They will have excellent object-oriented programming skills and understand the importance of code refactoring and optimization. They will also be comfortable configuring web servers (e.g. IIS or Apache Tomcat) and performing other back-end maintenance tasks. The ideal candidate will have a minimum of 3 years of commercial JavaScript/web development with backend experience, as well as an understanding of a selection of recent technologies for development, automated testing, and deployment automation (e.g. MVC, NodeJS, Docker, Kubernetes, Selenium, NUnit, Karma/Jasmine, Azure, Vagrant, Chef, Puppet, etc.). As a confident developer, they would also be expected to lead juniors and assist with the architecture and technology choices for new components, including building proofs of concept, establishing and improving the build chain, and constructing test-automation elements that aid the development process, where appropriate. In addition to technical skills, they will be proactive in learning new technologies, be fascinated with AI, and likely have experimented with its use. They will LOVE coding and thrive in a fast-paced team environment.
Qualifications: Bachelor's degree (2:1 from a recognized university)

Skills and experience
- Full-Stack Role: JavaScript + C#.NET/Java/NodeJS + IIS/Apache
- Desired: AI, APIs, SQL Server, NoSQL/graph databases (Neo4j, MongoDB, etc.), authentication and security (SAML2, etc.)

Company Description
At Quest, we create and manage the software that makes the benefits of new technology real. Companies turn to us to manage, modernize, and secure their business, from on-prem to in-cloud, from the heart of the network to the vulnerable endpoints. From complex challenges like Active Directory management and Office 365 migration, to database and systems management, to redefining security, and hundreds of needs in between, we help you conquer your next challenge now. We're not the company that makes big promises. We're the company that fulfills them. We're Quest: Where Next Meets Now.

Why work with us!
Life at Quest means collaborating with dedicated professionals with a passion for technology. When we see something that could be improved, we get to work inventing the solution. Our people demonstrate our winning culture through positive and meaningful relationships. We invest in our people and offer a series of programs that enable them to pursue a career that fulfills their potential. Our team members' health and wellness is our priority, as well as rewarding them for their hard work.

Quest is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations, and ordinances. Come join us. For more information, visit us on the web at Quest Careers | Where next meets now.
Join Quest. Job seekers should be aware of fraudulent job offers from online scammers and only apply to roles listed on quest.com/careers using our applicant system. Note: We do not use text messaging or third-party messaging apps like Telegram to communicate with applicants, so please exercise caution if you are approached in this way and only interact with people claiming to be Quest employees if they have an email address ending in @quest.com or @oneidentity.com #LI-SR1
Posted 6 days ago
0 years
0 Lacs
Chandigarh, India
On-site
Company Profile
Oceaneering is a global provider of engineered services and products, primarily to the offshore energy industry. We develop products and services for use throughout the lifecycle of an offshore oilfield, from drilling to decommissioning. We operate the world's premier fleet of work-class ROVs. Additionally, we are a leader in offshore oilfield maintenance services, umbilicals, subsea hardware, and tooling. We also use applied technology expertise to serve the defense, entertainment, material handling, aerospace, science, and renewable energy industries.

Since 2003, Oceaneering's India Center has been an integral part of operations for Oceaneering's robust product and service offerings across the globe. This center caters to diverse business needs, from oil and gas field infrastructure and subsea robotics to automated material handling & logistics. Our multidisciplinary team offers a wide spectrum of solutions, encompassing Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV operations, Field Network Management, Graphics Design & Animation, and more. In addition to these technical functions, Oceaneering India Center plays host to several crucial business functions, including Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE).

Our world-class infrastructure in India includes modern offices, industry-leading tools and software, equipped labs, and beautiful campuses aligned with the future way of work. Oceaneering, in India as well as globally, has a great work culture that is flexible, transparent, and collaborative, with great team synergy. At Oceaneering India Center, we take pride in "Solving the Unsolvable" by leveraging the diverse expertise within our team. Join us in shaping the future of technology and engineering solutions on a global scale.
Position Summary
The Principal Data Scientist will develop Machine Learning and/or Deep Learning based integrated solutions that address customer needs such as topside and subsea inspection. They will also be responsible for developing machine learning algorithms for automation and data analytics programs for Oceaneering's next-generation systems. The position requires the Principal Data Scientist to work with various Oceaneering business units across global time zones, but also offers the flexibility of a hybrid work-office environment.

Essential Duties And Responsibilities
- Lead and supervise a team of moderately experienced engineers on product/prototype design and development assignments or applications.
- Work both independently and collaboratively to develop custom data models and algorithms to apply to data sets that will be deployed in existing and new products.
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies.
- Assess the effectiveness and accuracy of new data sources and data-gathering techniques.
- Build data models and organize structured and unstructured data to interpret solutions.
- Prepare data for predictive and prescriptive modeling.
- Architect solutions through selection of appropriate technology and components.
- Determine the technical direction and strategy for solving complex, significant, or major issues.
- Plan and evaluate architectural design and identify technical risks and associated ways to mitigate those risks.
- Prepare design proposals to reflect cost, schedule, and technical approaches.
- Recommend test control, strategies, apparatus, and equipment.
- Develop, construct, test, and maintain architectures.
- Lead research activities for ongoing government and commercial projects and products.
- Collaborate on proposals, grants, and publications in algorithm development.
- Collect data as warranted to support the algorithm development efforts.
- Work directly with software engineers to implement algorithms in commercial software products.
- Work with third parties to utilize off-the-shelf industrial solutions.
- Develop algorithms in key research areas based on the client's technical problem. This requires continually reading papers and staying ahead of the field by knowing what is, and will be, state of the art.
- Ability to work hands-on in cross-functional teams with a strong sense of self-direction.

Non-essential
- Develop an awareness of programming and design alternatives
- Cultivate and disseminate knowledge of application development best practices
- Gather statistics and prepare and write reports on the status of the programming process for discussion with management and/or team members
- Direct research on emerging application development software products, languages, and standards in support of procurement and development efforts
- Train, manage, and provide guidance to junior staff
- Perform all other duties as requested, directed, or assigned

Supervisory Responsibilities
This position does not have direct supervisory responsibilities.

Reporting Relationship
Engagement Head

Qualifications
REQUIRED
- Bachelor's degree in Electronics and Electrical Engineering (or a related field) with eight or more years of experience working on Machine Learning and Deep Learning based projects, OR
- Master's degree in Data Science (or a related field) with six or more years of experience working on Machine Learning and Deep Learning based projects

DESIRED
- Strong knowledge of advanced statistical functions: histograms and distributions, regression studies, scenario analysis, etc.
- Proficient in Object-Oriented Analysis, Design, and Programming
- Strong background in data engineering tools like Python/C#, R, Apache Spark, Scala, etc.
- Prior experience handling large amounts of data, including text, shapes, sound, images, and/or video.
- Knowledge of SaaS platforms like Microsoft Fabric, Databricks, Snowflake, H2O, etc.
- Background working on cloud platforms like Azure (ML), AWS (SageMaker), or GCP (Vertex), etc.
- Proficient in querying SQL and NoSQL databases
- Hands-on experience with various databases like MySQL/PostgreSQL/Oracle, MongoDB, InfluxDB, TimescaleDB, Neo4j, ArangoDB, Redis, Cassandra, etc.
- Prior experience with at least one probabilistic/statistical ambiguity-resolution algorithm
- Proficient in Windows and Linux operating systems
- Basic understanding of ML frameworks like PyTorch and TensorFlow
- Basic understanding of messaging and IoT technologies like Kafka, MQTT, or RabbitMQ
- Prior experience with big data platforms like Hadoop, Apache Spark, or Hive is a plus.

Knowledge, Skills, Abilities, And Other Characteristics
- Ability to analyze situations accurately, utilizing a variety of analytical techniques, in order to make well-informed decisions
- Ability to effectively prioritize and execute tasks in a high-pressure environment
- Skill to gather, analyze, and interpret data
- Ability to determine and meet customer needs
- Ensures that others involved in a project or effort are kept informed about developments and plans
- Knowledge of communication styles and techniques
- Ability to establish and maintain cooperative working relationships
- Skill to prioritize workflow in a changing work environment
- Knowledge of applicable data privacy practices and laws
- Strong analytical and problem-solving skills

Additional Information
This position is considered OFFICE WORK, which is characterized as follows:
- Almost exclusively indoors during the day and occasionally at night
- Occasional exposure to airborne dust in the workplace
- Work surface is stable (flat)

The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
This position is considered LIGHT work.
- OCCASIONAL: Lift up to 20 pounds; climbing, stooping, kneeling, squatting, and reaching
- FREQUENT: Lift up to 10 pounds; standing
- CONSTANT: Repetitive movements of arms and hands; sit with back supported

Closing Statement
In addition, we make a priority of providing learning and development opportunities to enable employees to achieve their potential and take charge of their future. As well as developing employees in a specific role, we are committed to lifelong learning and ongoing education, including developing people skills and identifying future supervisors and managers. Every month, hundreds of employees are provided training, including HSE awareness, apprenticeships, entry and advanced level technical courses, management development seminars, and leadership and supervisory training. We have a strong ethos of internal promotion. We can offer long-term employment and career advancement across countries and continents. Working at Oceaneering means that if you have the ability, drive, and ambition to take charge of your future, you will be supported to do so, and the possibilities are endless.

Equal Opportunity/Inclusion
Oceaneering's policy is to provide equal employment opportunity to all applicants.
Posted 6 days ago
8.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Key Responsibilities
- Lead the overall architecture and development of the Internal Developer Portal
- Design and implement the core blueprint system and graph-based data model
- Mentor team members and coordinate development efforts
- Ensure code quality, performance, and security best practices
- Communicate with stakeholders and manage project timelines

Required Skills
- 8+ years of experience with TypeScript/JavaScript development
- 5+ years of experience with React and modern frontend development
- Strong experience with NestJS or similar Node.js frameworks
- Expert-level knowledge of GraphQL API design and implementation
- Experience with graph databases (Neo4j)
- Experience with microservices architecture and event-driven systems
- Experience with Kafka or similar message brokers
- Strong system design and architectural skills
- 3+ years of experience leading development teams
- Experience with developer portals, internal platforms, or similar tools
- Knowledge of Kubernetes and cloud-native technologies
- Experience with CI/CD systems and DevOps practices

Desired Skills
- Familiarity with Restate or similar workflow engines
- Experience with durable execution patterns
- Knowledge of enterprise security patterns
- Experience with high-performance data visualization
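The "blueprint system and graph-based data model" this posting describes can be sketched in miniature: blueprints declare the schema for a kind of entity, and entities reference each other to form a graph. This is a hedged illustration only, not the portal's actual design; the names Blueprint, Entity, and Catalog are hypothetical, and Python is used for brevity even though the posting's stack is TypeScript.

```python
from dataclasses import dataclass, field

@dataclass
class Blueprint:
    """Declares the properties every entity of its kind must carry."""
    name: str
    properties: list

@dataclass
class Entity:
    id: str
    blueprint: str
    properties: dict
    relations: dict = field(default_factory=dict)  # relation name -> target entity id

class Catalog:
    """Validates entities against their blueprints and stores them as a graph."""
    def __init__(self):
        self.blueprints, self.entities = {}, {}

    def register(self, blueprint):
        self.blueprints[blueprint.name] = blueprint

    def add(self, entity):
        # Enforce the blueprint's schema before admitting the entity.
        missing = [p for p in self.blueprints[entity.blueprint].properties
                   if p not in entity.properties]
        if missing:
            raise ValueError(f"entity {entity.id} is missing properties: {missing}")
        self.entities[entity.id] = entity

    def neighbors(self, entity_id):
        """Follow outgoing relations -- one hop in the entity graph."""
        return [self.entities[t] for t in self.entities[entity_id].relations.values()]
```

In a production portal the Catalog's storage would sit behind a graph database such as Neo4j and be exposed over GraphQL; the in-memory dicts here stand in for both.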
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
What will you do
As a Java Lead you will work as part of a team on the design and implementation of Java-based enterprise solutions using agile software development methodologies. You will have the opportunity to take full responsibility for the technical design and implementation of specific business areas using the latest frameworks.

Job responsibilities:
- Meet with technology managers and the design team to discuss the goals and needs of the company.
- Lead a team of software developers.
- Capture functional and non-functional requirements and design technical solutions leveraging the Spring framework.
- Examine and define current architecture systems.
- Design scalable architecture systems for Java-based applications.
- Code efficiently, review code, and provide technical guidance, identifying what needs to be implemented or executed to accommodate the company's architectural goals.
- Contribute to the development process by implementing PoCs and standardizing software delivery by adopting DevOps practices.
- Troubleshoot design flaws and system bottlenecks.
- Perform validation tests, system performance tests, and others to ensure the flexibility and scalability of the Java environment.
- Ensure the overall quality and fit of the technical solution, in addition to the overall performance of the application stack.
- Oversee the progress of the development team to ensure consistency with the initial design, development principles, and deadlines.
- Assist the software design team with application integration.

What are we looking for:
- Advanced knowledge of development, design, web programming, and implementation of software networks.
- Proficient with Java, Spring Boot, and Spring Cloud features such as configuration management, circuit breakers, security, service discovery, Sleuth, and load balancing.
- Deep understanding of and experience with multithreading.
- Able to create distributed and scalable architectures.
- In the habit of learning and exploring new technologies.
- Understanding of Apache Spark, data science, ML & AI.
- Experience with RDBMSs such as Oracle and MySQL.
- Knowledge of NoSQL databases such as MongoDB, Neo4j, and Cassandra.
- Ability to solve complex software system issues.
- Ability to clearly present technical information to fellow technical professionals and non-technical peers.
- Updates job knowledge by participating in educational opportunities, reading professional publications, and participating in professional organizations.
- Entrepreneurial skills; the ability to observe, innovate, and own your work.
- Detail-oriented and organized, with strong time management skills.
- Influencing skills and the ability to create positive working relationships with team members at all levels.
- Excellent communication and interpersonal skills.
- A collaborative approach, working as a group to achieve organizational goals.

Education Qualification: Bachelor's degree in software engineering or computer science.
Experience: Total experience 8 to 12 years. Industry: IT/Software/BFSI/Banking/Fintech
Work arrangement: 5 days working from office
Location: Noida/Bangalore

What do we offer:
- An organization where we strongly believe in one organization, one goal.
- A fun workplace that compels us to challenge ourselves and aim higher.
- A team that strongly believes in collaboration and celebrating success together.
- Benefits that resonate with 'We Care'.

If this opportunity excites you, we invite you to apply and contribute to our success story. If your resume is shortlisted, you will hear back from us.
Posted 6 days ago
6.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
We at MakeMyTrip understand that every traveller is unique, and as the leading OTA in India we have the leverage to redefine the travel booking experience to meet their needs. If you love to travel and want to be part of a dynamic team that works on personalizing every user's journey, then look no further. We are looking for a brilliant mind like yours to join our Data Platform team to build exciting data products at scale, where we solve for industry-best, fault-tolerant feature stores, real-time data pipelines, catalogs, and much more.

Hands-on: Spark, Scala
Technologies: Spark, Aerospike, Databricks, Kafka, Debezium, EMR, Athena, Glue, RocksDB, Redis, Airflow, MySQL, and any other data sources (e.g. Mongo, Neo4j, etc.) used by other teams.
Location: Gurgaon/Bengaluru
Experience: 6+ years
Industry Preference: E-Commerce
Posted 6 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
- Develop and maintain Python-based applications and services.
- Design and query graph databases using Neo4j.
- Work with PostgreSQL and ClickHouse for efficient data storage and analytics.
- Build and deploy applications using serverless architecture on AWS, Azure, or GCP.
- Containerize applications using Docker for streamlined development and deployment.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Optimize performance and scalability of backend services.

Skills and Qualifications:
- Proficient in Python programming with strong coding and debugging skills.
- Experience with Neo4j and the Cypher query language.
- Familiarity with PostgreSQL and ClickHouse for relational and columnar data storage.
- Hands-on experience with cloud platforms: AWS, Azure, and GCP.
- Understanding of serverless computing (e.g., AWS Lambda, Azure Functions, Google Cloud Functions).
- Proficiency with Docker for containerization and deployment.
- Strong understanding of software development best practices and version control.

Preferred Qualifications:
- Experience with CI/CD pipelines and infrastructure as code (e.g., Terraform, CloudFormation).
- Knowledge of data pipelines, ETL processes, or real-time data streaming.
- Familiarity with monitoring tools (e.g., Prometheus, Grafana).
- Bachelor's degree in Computer Science, Engineering, or a related field.
(ref:hirist.tech)
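The combination this posting asks for, a serverless Python backend querying Neo4j, can be sketched as a minimal AWS Lambda-style handler. This is a hedged sketch under stated assumptions: the names get_person_query and handler are illustrative, and the actual database call is omitted so the example stays self-contained (a real deployment would execute the query through the official neo4j Python driver).

```python
import json

def get_person_query(name: str):
    """Build a parameterized Cypher query; parameters avoid string
    interpolation and the injection risks that come with it."""
    return (
        "MATCH (p:Person {name: $name})-[:KNOWS]->(f:Person) "
        "RETURN f.name AS friend",
        {"name": name},
    )

def handler(event, context=None):
    """Minimal Lambda-style entry point: validate input, build the query,
    return an API Gateway-shaped response."""
    name = (event.get("queryStringParameters") or {}).get("name")
    if not name:
        return {"statusCode": 400,
                "body": json.dumps({"error": "name is required"})}
    cypher, params = get_person_query(name)
    # A real deployment would open a neo4j driver session here and run
    # session.run(cypher, params); omitted to keep the sketch dependency-free.
    return {"statusCode": 200,
            "body": json.dumps({"cypher": cypher, "params": params})}
```

The same validate-then-query shape carries over unchanged to Azure Functions or Google Cloud Functions; only the entry-point signature differs.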
Posted 6 days ago
5.0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
About The Job
Job Description: We are seeking a highly skilled and customer-focused GraphDB/Neo4j Solutions Engineer to join our team. This role is responsible for delivering high-quality solution implementations of our GraphDB-based product to customers and collaborating with cross-functional teams to ensure customer success. The Solution Lead is expected to provide in-depth solutions on a data-based software product to a global client base and partners. This role requires deep technical expertise, strong problem-solving skills, and the ability to communicate complex technical information effectively. The Solution Lead must have experience working with databases, specifically graph databases, and possess a strong background in Linux, networking, and scripting (bash/python).

Roles And Responsibilities
- Collaborate with core engineering, customers, and solution engineering teams for functional and technical discovery sessions.
- Prepare product and live software demonstrations.
- Create and maintain public documentation, internal knowledge base articles, and FAQs.
- Design efficient graph schemas and develop prototypes that address customer requirements (e.g., Fraud Detection, Recommendation Engines, Knowledge Graphs).
- Apply knowledge of indexing strategies, partitioning, and query optimization in GraphDB.
- Work during the EMEA time zone (2 PM to 10 PM shift).

Requirements
Education and Experience:
- Education: B.Tech in Computer Engineering, Information Technology, or a related field.
- Experience: 5+ years in a Solution Lead role on a data-based software product such as GraphDB or Neo4j.

Must Have Skills
- SQL Expertise: 4+ years of experience in SQL for database querying, performance tuning, and debugging.
- Graph Databases and GraphDB platforms: 4+ years of hands-on experience with Neo4j or similar graph database systems.
- Scripting & Automation: 4+ years with strong skills in C, C++, and Python for automation, task management, and issue resolution.
- Virtualization and Cloud knowledge: 4+ years with Azure, GCP, or AWS.
- Management skills: 3+ years of experience with data requirements gathering and data modeling, whiteboarding, and developing/validating proposed solution architectures. The ability to communicate complex information and concepts to prospective users in a clear and effective way.
- Monitoring & Performance Tools: Experience with Grafana, Datadog, Prometheus, or similar tools for system and performance monitoring.
- Networking & Load Balancing: Proficient in TCP/IP, load balancing strategies, and troubleshooting network-related issues.
(ref:hirist.tech)
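The fraud-detection use case mentioned above typically rests on one graph pattern: accounts linked through a shared identifier (device, phone number) form candidate fraud rings. A hedged, driver-free sketch of that pattern follows; the function names are illustrative, and the in-memory dicts stand in for what a graph database would store natively.

```python
from collections import defaultdict

def build_graph(links):
    """links: (account, identifier) pairs -> map identifier to the set of
    accounts that use it (the reverse edges of USES relationships)."""
    graph = defaultdict(set)
    for account, identifier in links:
        graph[identifier].add(account)
    return graph

def shared_identifier_rings(graph, min_size=2):
    """Identifiers shared by at least min_size accounts are candidate fraud
    rings. In Neo4j the same check is a single Cypher pattern along the lines
    of: MATCH (a:Account)-[:USES]->(d:Device)<-[:USES]-(b:Account)."""
    return {ident: accounts for ident, accounts in graph.items()
            if len(accounts) >= min_size}
```

The point of modeling this as a graph rather than joins over relational tables is that the suspicious pattern is a local neighborhood lookup, which graph databases index directly.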
Posted 6 days ago
5.0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
About The Job
Job Description: We are seeking a highly skilled and customer-focused Technical Support Engineer to join our team. This role is responsible for delivering high-quality technical support to our customers, troubleshooting complex technical issues, and collaborating with cross-functional teams to ensure customer success. The Technical Support Engineer is expected to provide advanced technical support on a data-based software product to a global client base and partners. This role requires deep technical expertise, strong problem-solving skills, and the ability to communicate complex technical information effectively. The primary responsibility is to troubleshoot and resolve technical issues, support product adoption, and ensure customer satisfaction. The TSE must have experience working with databases, specifically graph databases, and possess a strong background in Linux, networking, and scripting (bash/python). They work collaboratively with engineering teams to escalate and resolve complex issues when necessary (i.e. a code change is required, or a behavior is seen for the first time).

Roles And Responsibilities
- Respond to customer inquiries and provide in-depth technical support via multiple communication channels.
- Collaborate with core engineering and solution engineering teams to diagnose and resolve complex technical problems.
- Create and maintain public documentation, internal knowledge base articles, and FAQs.
- Monitor and meet SLAs.
- Triage varying issues in a timely manner based on error messages, log files, thread dumps, stack traces, sample code, and other available data points.
- Efficiently troubleshoot cluster issues across multiple servers, data centers, and regions, in a variety of clouds (AWS, Azure, GCP, etc.), virtual, and bare-metal environments.
- Work during the EMEA time zone (2 PM to 10 PM shift).

Requirements
Must Have Skills:
- Education: B.Tech in Computer Engineering, Information Technology, or a related field.
Experience
- GraphDB experience is a must.
- 5+ years of experience in a technical support role on a data-based software product, at least at L3 level.
- Linux Expertise: 4+ years with an in-depth understanding of Linux, including the filesystem, process management, memory management, networking, and security.
- Graph Databases: 3+ years of experience with Neo4j or similar graph database systems.
- SQL Expertise: 3+ years of experience in SQL for database querying, performance tuning, and debugging.
- Data Streaming & Processing: 2+ years of hands-on experience with Kafka, Zookeeper, and Spark.
- Scripting & Automation: 2+ years with strong skills in Bash scripting and Python for automation, task management, and issue resolution.
- Containerization & Orchestration: 1+ year of proficiency in Docker, Kubernetes, or other containerization technologies is essential.
- Monitoring & Performance Tools: Experience with Grafana, Datadog, Prometheus, or similar tools for system and performance monitoring.
- Networking & Load Balancing: Proficient in TCP/IP, load balancing strategies, and troubleshooting network-related issues.
- Web & API Technologies: Understanding of HTTP, SSL, and REST APIs for debugging and troubleshooting API-related issues.

Nice To Have Skills
- Familiarity with Data Science or ML will be an edge.
- Experience with LDAP, SSO, and OAuth authentication.
- Strong understanding of database internals and system architecture.
- Cloud certification (at least DevOps Engineer level)
(ref:hirist.tech)
Posted 6 days ago
13.0 years
0 Lacs
Andhra Pradesh, India
On-site
Summary about Organization
A career in our Advisory Acceleration Center is the natural extension of PwC's leading global delivery capabilities. The team consists of highly skilled resources that can assist in helping clients transform their business by adopting technology using bespoke strategy, operating models, processes, and planning. You'll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure and modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams in these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence.

Job Description
Senior Data Architect with experience in the design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures. Key areas of expertise include:
- Design and implement scalable and efficient data architectures to support business needs.
- Develop data models (conceptual, logical, and physical) that align with organizational goals.
- Lead database design and optimization efforts for structured and unstructured data.
- Establish ETL pipelines and data integration strategies for seamless data flow.
- Define data governance policies, including data quality, security, privacy, and compliance.
- Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions.
- Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift).
- Ensure high availability, disaster recovery, and backup strategies for critical databases.
- Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency.
- Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13+ years of experience in data modeling, including conceptual, logical, and physical data design.
- 5-8 years of experience with cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery.
- Proven experience with NoSQL databases and data modeling techniques for non-relational data.
- Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark).
- Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems.
- Strong understanding of data governance, data security, and compliance best practices.
- Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
- Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders.

Preferred Skills
- Experience with modern data architectures, such as data fabric or data mesh.
- Knowledge of graph databases and modeling for technologies like Neo4j.
- Proficiency with programming languages like Python, Scala, or Java.
- Understanding of CI/CD pipelines and DevOps practices in data engineering.
Posted 6 days ago