Jobs
Interviews

6030 Scala Jobs - Page 5

Set up a job alert
JobPe aggregates results for easy application access, but you apply directly on the employer's job portal.

2.0 - 4.0 years

6 - 9 Lacs

Hyderābād

On-site

Summary
As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage.

About the Role
Location – Hyderabad #LI-Hybrid

Key Responsibilities:
• Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage.
• Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions.
• Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security.
• Write clean, efficient, and reusable code in scripting languages such as Python or Scala to automate data workflows and ETL processes.
• Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark.
• Perform data quality checks and ensure data integrity across different data sources and systems.
• Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues.
• Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools.
• Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies.

Essential Requirements:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with 2-4 years of overall experience.
• Proven experience as a Data Engineer or in a similar role, with a focus on scripting, streaming data pipelines, and cloud technologies like AWS, GCP, or Azure.
• Strong programming and scripting skills in languages like Python, Scala, or SQL.
• Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform.
• Hands-on experience with streaming technologies such as AWS StreamSets, Apache Kafka, Apache Flink, or Apache Spark Streaming.
• Strong experience with Snowflake (required).
• Proficiency with big data frameworks and tools such as Hadoop, Hive, or HBase.
• Knowledge of SQL and experience with relational and NoSQL databases.
• Familiarity with data modelling and schema design principles.
• Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
• Excellent communication and teamwork skills.

Commitment to Diversity and Inclusion:
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation:
Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis:
Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network:
Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards:
Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No
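The responsibilities above center on ETL pipelines with data quality checks. As a minimal, self-contained Python sketch of that pattern (all names and sample data are illustrative, not Novartis code):

```python
import csv
import io

# Tiny illustrative input; a real pipeline would read from a source system.
RAW = """id,amount,currency
1,100.5,INR
2,,INR
3,250.0,USD
"""

def extract(text):
    """Parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast amounts to float, dropping rows that fail the cast."""
    out = []
    for row in rows:
        try:
            out.append({**row, "amount": float(row["amount"])})
        except ValueError:
            pass  # a real pipeline would quarantine bad records instead
    return out

def quality_check(rows):
    """A simple integrity check: every surviving row has a positive amount."""
    return all(r["amount"] > 0 for r in rows)

clean = transform(extract(RAW))
assert quality_check(clean)
```

The same extract/transform/check separation carries over directly to Spark or Snowflake jobs, where each stage becomes a distributed operation.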

Posted 2 days ago

Apply

2.0 years

4 - 10 Lacs

Gurgaon

On-site

Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated, and we know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We're building a more open world. Join us.

Are you a technologist who is passionate about building robust, scalable, and performant applications and data products? This is exactly what we do - join the Data Engineering & Tooling Team! The Data Engineering & Tooling Team (part of Enterprise Data Products at Expedia) is responsible for making traveler, partner, and supply data accessible, unlocking insights and value. Our mission is to build and manage the travel industry's premier Data Products and SDKs.

Software Development Engineer II

Introduction to team
Our team is looking for a Software Engineer who applies engineering principles to build and improve existing systems. We follow Agile principles, and we're proud to offer a dynamic, diverse, and collaborative environment where you can play an impactful role and build your career. Would you like to be part of a global tech company that does travel? Don't wait - apply now!

In this role, you will:
• Implement products and solutions that are highly scalable, with high-quality, clean, maintainable, optimized, modular, and well-documented code across the technology stack.
• Craft APIs, and develop and test applications and services to ensure they meet design requirements.
• Work collaboratively with all members of the technical staff and other partners to build and ship outstanding software in a fast-paced environment.
• Apply knowledge of software design principles and Agile methodologies and tools.
• Resolve problems and roadblocks as they occur, with help from peers or managers; follow through on details and drive issues to closure.
• Assist with supporting production systems (investigate issues and work towards resolution).

Experience and qualifications:
• Bachelor's or Master's degree in Computer Science & Engineering or a related technical field, or equivalent related professional experience.
• 2+ years of software development or data engineering experience in an enterprise-level engineering environment.
• Proficient with Object-Oriented Programming concepts, with a strong understanding of Data Structures, Algorithms, Data Engineering (at scale), and Computer Science fundamentals.
• Experience with Java, Scala, the Spring framework, micro-service architecture, and orchestration of containerized applications, along with a good grasp of OO design and strong design-patterns knowledge.
• Solid understanding of different API types (e.g. REST, GraphQL, gRPC), access patterns, and integration.
• Prior knowledge and experience of NoSQL databases (e.g. Elasticsearch, ScyllaDB, MongoDB).
• Prior knowledge and experience of big data platforms, batch processing (e.g. Spark, Hive), stream processing (e.g. Kafka, Flink), and cloud-computing platforms such as Amazon Web Services.
• Knowledge and understanding of monitoring tools, testing (performance, functional), and application debugging and tuning.
• Good communication skills in written and verbal form, with the ability to present information in a clear and concise manner.
Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others. Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50 Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.
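The qualifications above contrast batch processing (Spark, Hive) with stream processing (Kafka, Flink). A core streaming idea is the tumbling window: events are bucketed into fixed, non-overlapping time slices and aggregated per slice. A minimal pure-Python sketch of that idea (event data and names are illustrative, not Expedia code):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Assign each (timestamp, key) event to a fixed-size window and count per key.

    This mirrors what Flink's TumblingEventTimeWindows or a Kafka Streams
    windowed aggregation does, minus distribution and late-data handling.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_size][key] += 1
    # Convert nested defaultdicts to plain dicts for readability.
    return {w: dict(counts) for w, counts in windows.items()}

# Hypothetical clickstream events: (event-time, event-type).
events = [(0, "search"), (2, "click"), (3, "search"), (7, "click")]
result = tumbling_window_counts(events, 5)
# result -> {0: {"search": 2, "click": 1}, 1: {"click": 1}}
```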

Posted 2 days ago

Apply

7.0 years

21 Lacs

Gurgaon

On-site

Job Title: Data Engineer
Location: Gurgaon (Onsite)
Experience: 7+ Years
Employment Type: Contract (6 months)

Job Description:
We are seeking a highly experienced Data Engineer with a strong background in building scalable data solutions using Azure/AWS Databricks, Scala/Python, and Big Data technologies. The ideal candidate should have a solid understanding of data pipeline design, optimization, and cloud-based deployments.

Key Responsibilities:
• Design and build data pipelines and architectures on Azure or AWS
• Optimize Spark queries and Databricks workloads
• Manage structured/unstructured data using best practices
• Implement scalable ETL processes with tools like Airflow, Kafka, and Flume
• Collaborate with cross-functional teams to understand and deliver data solutions

Required Skills:
• Azure/AWS Databricks
• Python / Scala / PySpark
• SQL, RDBMS
• Hive / HBase / Impala / Parquet
• Kafka, Flume, Sqoop, Airflow
• Strong troubleshooting and performance tuning in Spark

Qualifications:
• Bachelor's degree in IT, Computer Science, Software Engineering, or a related field
• 7+ years of experience in Data Engineering/Analytics

Apply now if you're looking to join a dynamic team working with cutting-edge data technologies!

Job Type: Contractual / Temporary
Contract length: 6 months
Pay: From ₹180,000.00 per month
Work Location: In person
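"Optimize Spark queries" in the responsibilities above often comes down to predicate pushdown: applying filters before a join so the join processes fewer rows. Spark's Catalyst optimizer does this automatically, but the idea is easy to see in plain Python (table contents and names are illustrative):

```python
# Toy tables; each row is a dict. In Spark these would be DataFrames.
orders = [{"order_id": 1, "cust": 1}, {"order_id": 2, "cust": 2}, {"order_id": 3, "cust": 1}]
customers = [{"id": 1, "region": "APAC"}, {"id": 2, "region": "EMEA"}]

def join_then_filter(orders, customers, region):
    """Naive plan: join everything, then filter the joined result."""
    joined = [(o, c) for o in orders for c in customers if o["cust"] == c["id"]]
    return [(o, c) for o, c in joined if c["region"] == region]

def filter_then_join(orders, customers, region):
    """Optimized plan: push the region filter below the join, shrinking its input."""
    keep = [c for c in customers if c["region"] == region]
    return [(o, c) for o in orders for c in keep if o["cust"] == c["id"]]

# Both plans return the same rows; the second one scans far fewer pairs.
assert join_then_filter(orders, customers, "APAC") == filter_then_join(orders, customers, "APAC")
```

On Databricks, checking that a filter was actually pushed down is typically done by inspecting the physical plan with `df.explain()`.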

Posted 2 days ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Chennai

On-site

3 - 5 Years | 5 Openings | Bangalore, Chennai, Kochi, Trivandrum

Role description
Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 – Software Engineering.

Outcomes:
• Understand and provide input to application/feature/component designs, developing them in accordance with user stories/requirements.
• Code, debug, test, document, and communicate product/component/features at development stages.
• Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components.
• Optimise efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models.
• Mentor Developer 1 – Software Engineering and Developer 2 – Software Engineering to effectively perform in their roles.
• Identify problem patterns and improve the technical design of the application/system.
• Proactively identify issues/defects/flaws in module/requirement implementation.
• Assist Lead 1 – Software Engineering on technical design.
• Review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes:
• Adherence to engineering process and standards (coding standards)
• Adherence to schedule/timelines
• Adherence to SLAs where applicable
• Number of defects post delivery
• Number of non-compliance issues
• Reduction of reoccurrence of known defects
• Quick turnaround of production bugs
• Meeting the defined productivity standards for the project
• Number of reusable components created
• Completion of applicable technical/domain certifications
• Completion of all mandatory training requirements

Outputs Expected:
• Code: Develop code independently for the above
• Configure: Implement and monitor the configuration process
• Test: Create and review unit test cases, scenarios, and execution
• Domain relevance: Develop features and components with a good understanding of the business problem being addressed for the client
• Manage Project: Manage module-level activities
• Manage Defects: Perform defect RCA and mitigation
• Estimate: Estimate time, effort, and resource dependence for one's own work and others' work, including modules
• Document: Create documentation for own work and perform peer review of documentation of others' work
• Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities
• Status Reporting: Report status of assigned tasks; comply with project-related reporting standards/processes
• Release: Execute the release process
• Design: LLD for multiple components
• Mentoring: Mentor juniors on the team; set FAST goals and provide feedback on mentees' FAST goals

Skill Examples:
• Explain and communicate the design/development to the customer
• Perform and evaluate test results against product specifications
• Develop user interfaces, business software components, and embedded software components
• Manage and guarantee high levels of cohesion and quality
• Use data models
• Estimate effort and resources required for developing/debugging features/components
• Perform and evaluate tests in the customer or target environment
• Team player with good written and verbal communication abilities
• Proactively ask for help and offer help

Knowledge Examples:
• Appropriate software programs/modules
• Technical designing
• Programming languages
• DBMS
• Operating systems and software platforms
• Integrated development environments (IDEs)
• Agile methods
• Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments:
• Design, develop, and optimize large-scale data pipelines using Azure Databricks (Apache Spark).
• Build and maintain ETL/ELT workflows and batch/streaming data pipelines.
• Collaborate with data analysts, scientists, and business teams to support their data needs.
• Write efficient PySpark or Scala code for data transformations and performance tuning.
• Implement CI/CD pipelines for data workflows using Azure DevOps or similar tools.
• Monitor and troubleshoot data pipelines and jobs in production.
• Ensure data quality, governance, and security as per organizational standards.

Skills: Databricks, ADB, ETL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
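The outputs above pair "write efficient PySpark or Scala transformations" with "create and review unit test cases". A common way to make pipeline logic unit-testable is to keep transformations as pure functions, separate from I/O. A small illustrative sketch (the dedup-to-latest step is a typical silver-layer transform; all names and data are hypothetical):

```python
def dedupe_latest(records, key="id", ts="updated_at"):
    """Keep only the most recent record per key, a common ETL dedup step.

    Pure function over plain dicts, so it can be unit-tested without a
    Spark cluster; the same logic maps to a window-function dedup in PySpark.
    """
    latest = {}
    for r in records:
        k = r[key]
        if k not in latest or r[ts] > latest[k][ts]:
            latest[k] = r
    return sorted(latest.values(), key=lambda r: r[key])

# A unit-test-style check on sample data.
rows = [
    {"id": 1, "updated_at": 10, "status": "open"},
    {"id": 1, "updated_at": 20, "status": "closed"},
    {"id": 2, "updated_at": 5, "status": "open"},
]
result = dedupe_latest(rows)
assert [r["status"] for r in result] == ["closed", "open"]
```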

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai

On-site

• Hands-on experience with testing frameworks in line with web apps, mobile, web services/APIs, networks, and blockchain.
• Experience with both commercial and open-source tools such as Burp Suite Professional, Nmap, Kali, Metasploit, etc.
• Experience with Open Web Application Security Project (OWASP) and Open Source Security Testing Methodology Manual (OSSTMM) methodologies and tools.
• Experience in preparing a security threat model and associated test plans.
• Experience in translating complex security threats into simpler procedures for web application developers, systems administrators, and management to understand security testing results.
• In-depth knowledge of application development processes and at least one programming or scripting language (e.g., Java, Scala, C#, Ruby, Perl, Python, PowerShell) is preferred.
• Knowledge of current information security threats.
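The scripting-language requirement above typically means automating small security checks. As a deliberately naive sketch, here is a signature-based payload check in Python; real scanners like Burp Suite use far richer, context-aware rules, and the two patterns below are illustrative only:

```python
import re

# Two toy signatures for classic OWASP categories (illustrative, not exhaustive):
SIGNATURES = [
    re.compile(r"<script\b", re.I),          # reflected-XSS probe
    re.compile(r"('|\")\s*or\s+1=1", re.I),  # SQL-injection tautology
]

def flags_payload(value):
    """Return True if the input matches any known-bad signature."""
    return any(sig.search(value) for sig in SIGNATURES)

assert flags_payload("' OR 1=1 --")
assert flags_payload("<script>alert(1)</script>")
assert not flags_payload("ordinary search text")
```

Signature matching like this only catches the crudest probes; the point of OWASP/OSSTMM methodology is to go beyond pattern matching to context-driven testing.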

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Role: Neo4j Engineer
Overall IT experience: 7+ years
Relevant experience: Graph databases: 4+ years; Neo4j: 2+ years
Location: Remote

Company Description
Bluetick Consultants is a technology-driven firm that supports hiring remote developers, building technology products, and enabling end-to-end digital transformation. With previous experience in top technology companies such as Amazon, Microsoft, and Craftsvilla, we understand the needs of our clients and provide customized solutions. Our team has expertise in emerging technologies, backend and frontend development, cloud development, and mobile technologies. We prioritize staying up-to-date with the latest technological advances to create a long-term impact and grow together with our clients.

Key Responsibilities
• Graph Database Architecture: Design and implement Neo4j graph database schemas optimized for fund administration data relationships and AI-powered queries
• Knowledge Graph Development: Build comprehensive knowledge graphs connecting entities like funds, investors, companies, transactions, legal documents, and market data
• Graph-AI Integration: Integrate Neo4j with AI/ML pipelines, particularly for enhanced RAG (Retrieval-Augmented Generation) systems and semantic search capabilities
• Complex Relationship Modeling: Model intricate relationships between Limited Partners, General Partners, fund structures, investment flows, and regulatory requirements
• Query Optimization: Develop high-performance Cypher queries for real-time analytics, relationship discovery, and pattern recognition
• Data Pipeline Integration: Build ETL processes to populate and maintain graph databases from various data sources, including FundPanel.io, legal documents, and external market data, using domain-specific ontologies
• Graph Analytics: Implement graph algorithms for fraud detection, risk assessment, relationship scoring, and investment opportunity identification
• Performance Tuning: Optimize graph database performance for concurrent users and complex analytical queries
• Documentation & Standards: Establish graph modelling standards, query optimization guidelines, and comprehensive technical documentation

Key Use Cases You'll Enable
• Semantic Search Enhancement: Create knowledge graphs that improve AI search accuracy by understanding entity relationships and context
• Investment Network Analysis: Map complex relationships between investors, funds, portfolio companies, and market segments
• Compliance Graph Modelling: Model regulatory relationships and fund terms to support automated auditing and compliance validation
• Customer Relationship Intelligence: Build relationship graphs for customer relations monitoring and expansion opportunity identification
• Predictive Modelling Support: Provide graph-based features for investment prediction and risk assessment models
• Document Relationship Mapping: Connect legal documents, contracts, and agreements through entity and relationship extraction

Required Qualifications
• Bachelor's degree in Computer Science, Data Engineering, or a related field
• 7+ years of overall IT experience
• 4+ years of experience with graph databases, with 2+ years specifically in Neo4j
• Strong background in data modelling, particularly for complex relationship structures
• Experience with financial services data and regulatory requirements preferred
• Proven experience integrating graph databases with AI/ML systems
• Understanding of knowledge graph concepts and semantic technologies
• Experience with high-volume, production-scale graph database implementations

Technology Skills
• Graph Databases: Neo4j (primary), Cypher query language, APOC procedures, Neo4j Graph Data Science library
• Programming: Python, Java, or Scala for graph data processing and integration
• AI Integration: Experience with graph-enhanced RAG systems, vector embeddings in a graph context, GraphRAG implementations
• Data Processing: ETL pipelines, data transformation, real-time data streaming (Kafka, Apache Spark)
• Cloud Platforms: Neo4j Aura, Azure integration, containerized deployments
• APIs: Neo4j drivers, REST APIs, GraphQL integration
• Analytics: Graph algorithms (PageRank, community detection, shortest path, centrality measures)
• Monitoring: Neo4j monitoring tools, performance profiling, query optimization
• Integration: Elasticsearch integration, vector database connections, multi-modal data handling

Specific Technical Requirements
• Knowledge Graph Construction: Entity resolution, relationship extraction, ontology modelling
• Cypher Expertise: Advanced Cypher queries, stored procedures, custom functions
• Scalability: Clustering, sharding, horizontal scaling strategies
• Security: Graph-level security, role-based access control, data encryption
• Version Control: Graph schema versioning, migration strategies
• Backup & Recovery: Graph database backup strategies, disaster recovery planning

Industry Context Understanding
• Fund Administration: Understanding of fund structures, capital calls, distributions, and investor relationships
• Financial Compliance: Knowledge of regulatory requirements and audit trails in financial services
• Investment Workflows: Understanding of due diligence processes, portfolio management, and investor reporting
• Legal Document Structures: Familiarity with LPA documents, subscription agreements, and fund formation documents

Collaboration Requirements
• AI/ML Team: Work closely with GenAI engineers to optimize graph-based AI applications
• Data Architecture Team: Collaborate on overall data architecture and integration strategies
• Backend Developers: Integrate graph databases with application APIs and microservices
• DevOps Team: Ensure proper deployment, monitoring, and maintenance of graph database infrastructure
• Business Stakeholders: Translate business requirements into effective graph models and queries

Performance Expectations
• Query Performance: Ensure sub-second response times for standard relationship queries
• Scalability: Support 100k+ users with concurrent access to graph data
• Accuracy: Maintain data consistency and relationship integrity across complex fund structures
• Availability: Ensure 99.9% uptime for critical graph database services
• Integration Efficiency: Seamless integration with existing FundPanel.io systems and new AI services

This role offers the opportunity to work at the intersection of advanced graph technology and artificial intelligence, creating innovative solutions that will transform how fund administrators understand and leverage their data relationships.
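The analytics skills listed above include shortest-path queries, which Cypher expresses as `MATCH p = shortestPath((a)-[*]-(b)) RETURN p`. The underlying algorithm for unweighted graphs is breadth-first search; a minimal pure-Python sketch on a toy fund-administration graph (entity names are hypothetical, not FundPanel.io data):

```python
from collections import deque

# Toy graph: node -> neighbours (LP = Limited Partner, GP = General Partner).
GRAPH = {
    "LP_Alice": ["Fund_I"],
    "Fund_I": ["LP_Alice", "Co_Acme"],
    "Co_Acme": ["Fund_I", "GP_Bravo"],
    "GP_Bravo": ["Co_Acme"],
}

def shortest_path(graph, start, goal):
    """Breadth-first search: the idea behind Cypher's shortestPath() for
    unweighted relationships. Returns the node list or None if unreachable."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

path = shortest_path(GRAPH, "LP_Alice", "GP_Bravo")
# path -> ["LP_Alice", "Fund_I", "Co_Acme", "GP_Bravo"]
```

In production, Neo4j's Graph Data Science library runs such algorithms in-database rather than in application code.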

Posted 2 days ago

Apply

8.0 years

0 Lacs

India

Remote

Azure Data Engineer
Location: Remote
Shift: 6am - 3pm US Central time zone

Job Summary:
We are seeking a highly skilled Data Engineer with strong experience in PostgreSQL and SQL Server, as well as hands-on expertise in Azure Data Factory (ADF) and Databricks. The ideal candidate will be responsible for building scalable data pipelines, optimizing database performance, and designing robust data models and schemas to support enterprise data initiatives.

Key Responsibilities:
• Design and develop robust ETL/ELT pipelines using Azure Data Factory and Databricks
• Develop and optimize complex SQL queries and functions in PostgreSQL
• Develop and optimize complex SQL queries in SQL Server
• Perform performance tuning and query optimization for PostgreSQL
• Design and implement data models and schema structures aligned with business and analytical needs
• Collaborate with data architects, analysts, and business stakeholders to understand data requirements
• Ensure data quality, integrity, and security across all data platforms
• Monitor and troubleshoot data pipeline issues and implement proactive solutions
• Participate in code reviews, sprint planning, and agile ceremonies

Required Skills & Qualifications:
• 8+ years of experience in data engineering or a related field
• Strong expertise in PostgreSQL and SQL Server development, performance tuning, and schema design
• Experience in data migration from SQL Server to PostgreSQL
• Hands-on experience with Azure Data Factory (ADF) and Databricks
• Proficiency in SQL, Python, or Scala for data processing
• Experience with data modeling techniques (e.g., star/snowflake schemas, normalization)
• Familiarity with CI/CD pipelines, version control (Git), and agile methodologies
• Excellent problem-solving and communication skills

If interested, share your resume at aditya.dhumal@leanitcorp.com
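The star schema mentioned in the skills above puts measures in a fact table and descriptive attributes in dimension tables, with analytics done by joining the fact to its dimensions and grouping. A minimal sketch using Python's built-in sqlite3 (table and column names are illustrative; the same SQL shape applies in PostgreSQL or SQL Server):

```python
import sqlite3

# One fact table plus one dimension: the smallest possible star.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (sale_id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# The canonical star-schema query: join fact to dimension, aggregate by attribute.
rows = con.execute("""
SELECT d.region, SUM(f.amount)
FROM fact_sales f JOIN dim_customer d USING (customer_id)
GROUP BY d.region ORDER BY d.region
""").fetchall()
# rows -> [('APAC', 150.0), ('EMEA', 75.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimensions.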

Posted 2 days ago

Apply

7.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
· Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
· Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
· Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
· Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
· Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
· Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
· Work with other members of the project team to support delivery of additional project components (API interfaces).
· Evaluate the performance and applicability of multiple tools against customer requirements.
· Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
· Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements:
· Proven experience working as a data engineer.
· Highly proficient in using the Spark framework (Python and/or Scala).
· Extensive knowledge of data warehousing concepts, strategies, and methodologies.
· Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
· Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
· Experience in designing and hands-on development of cloud-based analytics solutions.
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
· Designing and building of data pipelines using API ingestion and streaming ingestion methods.
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
· Thorough understanding of Azure cloud infrastructure offerings.
· Strong experience in common data warehouse modelling principles, including Kimball.
· Working knowledge of Python is desirable.
· Experience developing security models.
· Databricks & Azure Big Data Architecture Certification would be a plus.
· Must be team-oriented, with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 7-10 years
Education qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:

Posted 2 days ago

Apply

0.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Associate
Job Description & Summary: A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation’s objectives, regulatory and risk management environment, and the diverse needs of their critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to their organisation.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within….
Responsibilities:
Architecture Design:
· Design and implement scalable, secure, and high-performance architectures for Generative AI applications.
· Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization.
Model Development and Deployment:
· Fine-tune pre-trained generative models for domain-specific use cases.
· Define data collection, sanitization, and data preparation strategy for model fine-tuning.
· Well versed in machine learning paradigms such as supervised, unsupervised, and reinforcement learning, as well as deep learning.
· Well versed in ML models such as linear regression, decision trees, gradient boosting, random forests, and k-means.
· Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, CrewAI, AutoGen, LangGraph, agentic frameworks).
Innovation and Strategy:
· Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems.
· Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation.
· Good exposure to agentic design patterns.
Collaboration and Leadership:
· Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders.
· Mentor and guide team members on AI/ML best practices and architectural decisions.
· Should be able to lead a team of data scientists, GenAI engineers, and software developers.
Performance Optimization:
· Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy.
· Optimize computational costs and infrastructure utilization for large-scale deployments.
Ethical and Responsible AI:
· Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks.
· Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI.
Mandatory skill sets:
· Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark.
· Experience with machine learning and artificial intelligence frameworks, models, and libraries (TensorFlow, PyTorch, Scikit-learn, etc.).
· Strong knowledge of foundation LLMs (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models such as Llama 3.2 and Phi.
· Proven track record with event-driven architectures and real-time data processing systems.
· Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
· Deep experience with Azure OpenAI Service and vector DBs, including API integrations, prompt engineering, and model fine-tuning, or equivalent technologies in AWS/GCP.
· Knowledge of containerization technologies such as Kubernetes and Docker.
· Comprehensive understanding of data lakes and strategies for data management.
· Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
· Proficiency in cloud computing platforms such as Azure or AWS.
· Exceptional leadership, problem-solving, and analytical abilities.
· Superior communication and collaboration skills, with experience managing high-performing teams.
· Ability to operate effectively in a dynamic, fast-paced environment.
Preferred skill sets:
· Experience with additional technologies such as Datadog and Splunk.
· Programming languages like C#, R, and Scala.
· Possession of relevant solution architecture certificates and continuous professional development in data engineering and Gen AI.
Years of experience required: 0-1 Years
Education qualification: BE / B.Tech / MCA / M.Sc / M.E / M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor in Business Administration, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Java
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
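The posting above lists k-means among the expected ML models. As a rough illustration of the idea only, here is a minimal pure-Python sketch; the sample points and starting centroids are invented for the demo, and production work would use a library such as scikit-learn:

```python
import math

def kmeans(points, centroids, iters=10):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(coord) / len(pts) for coord in zip(*pts)) if pts else centroids[i]
            for i, pts in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated toy groups; the initial centroids are assumptions for the demo.
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(points, centroids=[(0.0, 0.0), (10.0, 10.0)])
```

With separated data like this, the algorithm converges in one pass; real fine-tuning pipelines would also standardize features and pick k via a heuristic such as the elbow method.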

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description & Requirements Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen. Software Engineer II - Platform Technology – Integrated Content The EA Digital Platform (EADP) organization powers the global EA ecosystem. We provide the foundation for all of EA’s incredible games and player experiences with high-level platforms like Cloud, Commerce, Data and AI, Gameplay Services, Identity and Social. By providing reusable capabilities that game teams can easily integrate into their work, we let them focus on making some of the best games in the world and creating meaningful relationships with our players. We’re behind the curtain, making it all work together. For this role, you will report to an Engineering Manager and get exposed to many technologies. You’ll work as part of a team, engaged with different EADP service teams and game teams around the globe to provide outstanding innovation and performance for our upcoming top games. Responsibilities Develop dynamic, responsive, and resilient systems using Java and Scala, ensuring high performance and innovation for our games. Collaborate closely with team members and cross-functional teams to gather requirements, and contribute to technical design documents, ensuring clarity and feasibility. Employ excellent problem-solving skills to navigate iteratively changing requirements during the development process. Design, implement and maintain moderately-complex systems and features end-to-end. Work in partnership with service teams and game development teams to identify and implement solutions and workflow enhancements. 
Contribute to the implementation, maintenance, and evolution of automated testing across a distributed service stack, ensuring reliability and quality. Provide support to manage and resolve integration issues and live incidents, ensuring smooth operations and game experiences. Qualifications Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field, showcasing a strong foundation in software engineering principles. 3+ years of industry experience Exposure to a variety of programming paradigms, including both imperative and functional programming languages, demonstrating versatility and a willingness to learn. Experience with Java, Scala, Kotlin, JavaScript/Node, Clojure Ability to work effectively within a team in an agile and iterative development environment, showcasing adaptability and collaborative skills. Strong communication skills, both written and verbal, capable of conveying complex technical concepts in a clear and understandable manner. You are a motivated individual with a passion for gaming and technology, eager to learn and grow within the field of software engineering. About Electronic Arts We’re proud to have an extensive portfolio of games and experiences, locations around the world, and opportunities across EA. We value adaptability, resilience, creativity, and curiosity. From leadership that brings out your potential, to creating space for learning and experimenting, we empower you to do great work and pursue opportunities for growth. We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more. We nurture environments where our teams can always bring their best to what they do. Electronic Arts is an equal opportunity employer. 
All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. We will also consider employment qualified applicants with criminal records in accordance with applicable law. EA also makes workplace accommodations for qualified individuals with disabilities as required by applicable law.

Posted 2 days ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Jagatpura, Jaipur, Rajasthan

On-site

AWS Data Engineer
Location: Jaipur
Mode: On-site
Experience: 2+ Years
The Role
Zynsera is looking for a talented AWS Data Engineer to join our dynamic team! If you have a strong grasp of AWS services, serverless data pipelines, and Infrastructure as Code — let’s connect.
As an AWS Data Engineer at Zynsera, you will:
Develop and optimize data pipelines using AWS Glue, Lambda, and Athena
Build infrastructure using AWS CDK for automation and scalability
Manage structured and semi-structured data with AWS Lakehouse & Iceberg
Design serverless architectures for batch and streaming workloads
Collaborate with senior engineers to drive performance and innovation
You're a Great Fit If You Have:
Proficiency in AWS Glue, Lambda, Athena, and Lakehouse architecture
Experience with CDK, Python, PySpark, Spark SQL, or Java/Scala
Familiarity with data lakes, data warehousing, and scalable cloud solutions
(Bonus) Knowledge of Firehose, Kinesis, Apache Iceberg, or DynamoDB
Job Types: Full-time, Permanent
Pay: ₹25,316.90 - ₹45,796.55 per month
Ability to commute/relocate: Jagatpura, Jaipur, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Experience: AWS Data Engineer: 1 year (Required)
Work Location: In person
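The serverless batch transforms this posting describes usually boil down to a small handler that parses raw records, drops rows that fail a quality check, and reshapes the rest for storage. A hedged pure-Python sketch of that core logic follows; the event shape and field names are hypothetical examples, not a real AWS Lambda or Kinesis payload:

```python
import json

def handler(event, context=None):
    """Sketch of a Lambda-style batch transform: parse raw JSON records,
    filter out rows missing required fields, and reshape for downstream storage."""
    out = []
    for rec in event["records"]:
        row = json.loads(rec)
        if row.get("amount") is None:  # basic data-quality gate: drop incomplete rows
            continue
        out.append({"id": row["id"], "amount_cents": round(row["amount"] * 100)})
    return out

# Hypothetical invocation: one valid record and one with a null amount.
result = handler({"records": ['{"id": 1, "amount": 9.99}',
                              '{"id": 2, "amount": null}']})
```

In a real deployment the function body would stay roughly this shape, while the event source (Kinesis, S3 trigger) and the sink (Iceberg table via Athena/Glue) are wired up outside the handler.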

Posted 2 days ago

Apply

18.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About The Company e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys and Naturium, high-performance, biocompatible, clinically-effective and accessible skincare. In our Fiscal year 24, we had net sales of $1 Billion and our business performance has been nothing short of extraordinary with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and are the fastest growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, and a hybrid 3 day in office, 2 day at home work environment. We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us Job Summary: We’re looking for a strategic and technically strong Senior Data Architect to join our high-growth digital team. The selected person will play a critical role in shaping the company’s global data architecture and vision. The ideal candidate will lead enterprise-level architecture initiatives, collaborate with engineering and business teams, and guide a growing team of engineers and QA professionals. This role involves deep engagement across domains including Marketing, Product, Finance, and Supply Chain, with a special focus on marketing technology and commercial analytics relevant to the CPG/FMCG industry. 
The candidate should bring a hands-on mindset, a proven track record in designing scalable data platforms, and the ability to lead through influence. An understanding of industry-standard frameworks (e.g., TOGAF), tools like CDPs, MMM platforms, and AI-based insights generation will be a strong plus. Curiosity, communication, and architectural leadership are essential to succeed in this role. Key Responsibilities Enterprise Data Strategy: Design, define and maintain a holistic data strategy & roadmap that aligns with corporate objectives and fuels digital transformation. Ensure data architecture and products aligns with enterprise standards and best practices Data Governance & Quality: Establish scalable governance frameworks to ensure data accuracy, privacy, security, and compliance (e.g., GDPR, CCPA). Oversee quality, security and compliance initiatives Data Architecture & Platforms: Oversee modern data infrastructure (e.g., data lakes, warehouses, streaming) with technologies like Snowflake, Databricks, AWS, and Kafka Marketing Technology Integration: Ensure data architecture supports marketing technologies and commercial analytics platforms (e.g., CDP, MMM, ProfitSphere) tailored to the CPG/FMCG industry Architectural Leadership: Act as a hands-on architect with the ability to lead through influence. Guide design decisions aligned with industry best practices and e.l.f.'s evolving architecture roadmap Cross-Functional Collaboration: Partner with Marketing, Supply Chain, Finance, R&D, and IT to embed data-driven practices and deliver business impact. Lead integration of data from multiple sources to unified data warehouse. Cloud Optimization : Optimize data flows, storage for performance and scalability. Lead data migration priorities, manage metadata repositories and data dictionaries. Optimise databases and pipelines for efficiency. 
Manage and track quality, cataloging and observability AI/ML Enablement: Drive initiatives to operationalize predictive analytics, personalization, demand forecasting, and more using AI/ML models. Evaluate emerging data technologies and tools to improve data architecture Team Leadership: Lead, mentor, and enable high-performing team of data engineers, analysts, and partners through influence and thought leadership Vendor & Tooling Strategy: Manage relationships with external partners and drive evaluations of data and analytics tools Executive Reporting: Provide regular updates and strategic recommendations to executive leadership and key stakeholders Data Enablement : Design data models, database structures, and data integration solutions to support large volumes of data Qualifications And Requirements Bachelor's or Master's degree in Computer Science, Information Systems, or a related field 18+ years of experience in Information Technology 8+ years of experience in data architecture, data engineering, or a related field, with a focus on large-scale, distributed systems Strong understanding of data use cases in the CPG/FMCG sector. Experience with tools such as MMM (Marketing Mix Modeling), CDPs, ProfitSphere, or inventory analytics preferred Awareness of architecture frameworks like TOGAF. Certifications are not mandatory, but candidates must demonstrate clear thinking and experience in applying architecture principles Must possess excellent communication skills and a proven ability to work cross-functionally across global teams. 
Should be capable of leading with influence, not just execution Knowledge of data warehousing, ETL/ELT processes, and data modeling Deep understanding of data modeling principles, including schema design and dimensional data modeling Strong SQL development experience including SQL Queries and stored procedures Ability to architect and develop scalable data solutions, staying ahead of industry trends and integrating best practices in data engineering Familiarity with data security and governance best practices Experience with cloud computing platforms such as Snowflake, AWS, Azure, or GCP Excellent problem-solving abilities with a focus on data analysis and interpretation Strong communication and collaboration skills Ability to translate complex technical concepts into actionable business strategies Proficiency in one or more programming languages such as Python, Java, or Scala This job description is intended to describe the general nature and level of work being performed in this position. It also reflects the general details considered necessary to describe the principal functions of the job identified, and shall not be considered, as detailed description of all the work required inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisors’ discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.

Posted 2 days ago

Apply

4.0 years

0 Lacs

India

On-site

Mandate: Advanced SQL & ETL experience, 4+ years
KNOWLEDGE AND EXPERIENCE
• 4 to 8 years of relevant work experience in software testing, primarily on Database/ETL, with exposure to Big Data testing
• Hands-on experience testing Big Data applications on Azure and Cloudera
• Understanding of additional query languages: Pig, HiveQL, etc.
• Excellent skills in writing SQL queries and good knowledge of databases [Oracle/Netezza/SQL]
• Hands-on with at least one scripting language: Java/Scala/Python, etc.
• Good to have experience in Unix shell scripting
• Experience in Agile development; knowledge of Jira
• Good analytical and communication skills
• Prior Health Care industry experience is a plus
• Flexible to work with and adapt quickly to different technologies and tools
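Database/ETL testing of the kind this posting describes often starts with SQL reconciliation checks between a source and a load target. As a minimal sketch using an in-memory SQLite database (the tables and rows are invented for the demo; real suites would run against Oracle/Netezza and a test framework):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE target (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO source VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO target VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0), (3, 30.0)])  # duplicated load

# Check 1: row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM source").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM target").fetchone()[0]

# Check 2: duplicate keys introduced by the load.
dupes = cur.execute(
    "SELECT id, COUNT(*) FROM target GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
```

Here the count mismatch (3 vs 4) and the duplicate key surfaced by the `GROUP BY ... HAVING` query are exactly the kinds of defects an ETL tester would raise.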

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Scala
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Scala.
- Strong understanding of application development methodologies.
- Experience with software development life cycle and agile practices.
- Familiarity with database management and data integration techniques.
Additional Information:
- The candidate should have minimum 7.5 years of experience in Python (Programming Language).
- This position is based in Pune.
- A 15 years full time education is required.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Scala
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Scala.
- Strong understanding of application development methodologies.
- Experience with software development life cycle and agile practices.
- Familiarity with database management and data integration techniques.
Additional Information:
- The candidate should have minimum 7.5 years of experience in Python (Programming Language).
- This position is based in Pune.
- A 15 years full time education is required.

Posted 2 days ago

Apply

12.0 years

0 Lacs

India

On-site

We are seeking a highly skilled and experienced AWS Architect with a strong background in Data Engineering and expertise in Generative AI. In this pivotal role, you will be responsible for designing, building, and optimizing scalable, secure, and cost-effective data solutions that leverage the power of AWS services, with a particular focus on integrating and managing Generative AI capabilities. The ideal candidate will possess a deep understanding of data architecture principles, big data technologies, and the latest advancements in Generative AI, including Large Language Models (LLMs) and Retrieval Augmented Generation (RAG). You will work closely with data scientists, machine learning engineers, and business stakeholders to translate complex requirements into robust and innovative solutions on the AWS platform.
Responsibilities:
• Architect and Design: Lead the design and architecture of end-to-end data platforms and pipelines on AWS, incorporating best practices for scalability, reliability, security, and cost optimization.
• Generative AI Integration: Architect and implement Generative AI solutions using AWS services like Amazon Bedrock, Amazon SageMaker, Amazon Q, and other relevant technologies. This includes designing RAG architectures, prompt engineering strategies, and fine-tuning models with proprietary data (knowledge base).
• Data Engineering Expertise: Design, build, and optimize ETL/ELT processes for large-scale data ingestion, transformation, and storage using AWS services such as AWS Glue, Amazon S3, Amazon Redshift, Amazon Athena, Amazon EKS, and Amazon EMR.
• Data Analytics: Design, build, and optimize analytical solutions for large-scale data ingestion, analytics, and insights using AWS services such as Amazon QuickSight.
• Data Governance and Security: Implement robust data governance, data quality, and security measures, ensuring compliance with relevant regulations and industry best practices for both traditional data and Generative AI applications.
• Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and Generative AI workloads, ensuring efficient resource utilization and optimal response times.
• Technical Leadership: Act as a subject matter expert and provide technical guidance to data engineers, data scientists, and other team members. Mentor and educate on AWS data and Generative AI best practices.
• Collaboration: Work closely with cross-functional teams, including product owners, data scientists, and business analysts, to understand requirements and deliver impactful solutions.
• Innovation and Research: Stay up-to-date with the latest AWS services, data engineering trends, and advancements in Generative AI, evaluating and recommending new technologies to enhance our capabilities.
• Documentation: Create comprehensive technical documentation, including architectural diagrams, design specifications, and operational procedures.
• Cost Management: Monitor and optimize AWS infrastructure costs related to data and Generative AI workloads.
Required Skills and Qualifications:
• 12+ years of experience in data engineering, data warehousing, or big data architecture.
• 5+ years of experience in an AWS Architect role, specifically with a focus on data.
• Proven experience designing and implementing scalable data solutions on AWS.
• Strong hands-on experience with core AWS data services, including:
o Data Storage: Amazon S3, Amazon Redshift, Amazon DynamoDB, Amazon RDS
o Data Processing: AWS Glue, Amazon EMR, Amazon EKS, AWS Lambda, Informatica
o Data Analytics: Amazon QuickSight, Amazon Athena, Tableau
o Data Streaming: Amazon Kinesis, Amazon MSK
o Data Lake: AWS Lake Formation
• Strong competencies in Generative AI, including:
o Experience with Large Language Models (LLMs) and Foundation Models (FMs).
o Hands-on experience with Amazon Bedrock (including model customization, agents, and orchestration).
o Understanding and experience with Retrieval Augmented Generation (RAG) architectures and vector databases (e.g., Amazon OpenSearch Service for vector indexing).
o Experience with prompt engineering and optimizing model responses.
o Familiarity with Amazon SageMaker for building, training, and deploying custom ML/Generative AI models.
o Knowledge of Amazon Q for business-specific Generative AI applications.
• Proficiency in programming languages such as Python (essential), SQL, and potentially Scala or Java.
• Experience with MLOps/GenAIOps principles and tools for deploying and managing Generative AI models in production.
• Solid understanding of data modeling, data warehousing concepts, and data lake architectures.
• Experience with CI/CD pipelines and DevOps practices on AWS.
• Excellent communication, interpersonal, and presentation skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
• Strong problem-solving and analytical abilities.
Preferred Qualifications:
• AWS Certified Solutions Architect – Professional or AWS Certified Data Engineer – Associate/Specialty.
• Experience with other Generative AI frameworks (e.g., LangChain) or open-source LLMs.
• Familiarity with containerization technologies like Docker and Kubernetes (Amazon EKS).
• Experience with data transformation tools like Informatica and Matillion.
• Experience with data visualization tools (e.g., Amazon QuickSight, Tableau, Power BI).
• Knowledge of data governance tools like Amazon DataZone.
• Experience in a highly regulated industry (e.g., Financial Services, Healthcare).
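At the heart of the RAG architectures this posting calls for is a retrieval step: rank stored document embeddings by similarity to a query embedding and feed the top hits to the model as context. A minimal pure-Python sketch of that step follows; the documents, embedding vectors, and dimensionality are toy assumptions (real systems would use a vector database such as Amazon OpenSearch Service and real embedding models):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "vector store": snippet name -> hypothetical 3-d embedding.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api rate limits": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k snippets most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

# A query embedding close to the "refund policy" direction.
top = retrieve([0.85, 0.15, 0.05])
```

The retrieved snippets would then be interpolated into the prompt sent to the LLM; everything beyond this ranking step (chunking, embedding generation, prompt templates) is where the AWS services named above come in.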

Posted 2 days ago

Apply

7.0 years

0 Lacs

Delhi, India

On-site

We are looking for a skilled Scala Developer with at least 7 years of professional experience in building scalable, high-performance backend applications. The ideal candidate should have a strong grasp of functional programming, data processing frameworks, and cloud-based environments.
Key Responsibilities:
Design, develop, test, and deploy backend services and APIs using Scala.
Collaborate with cross-functional teams including product managers, frontend developers, and QA engineers.
Optimize and maintain existing codebases, ensuring performance, scalability, and reliability.
Write clean, well-documented, and testable code following best practices.
Work with tools and technologies like Akka, Play Framework, and Kafka.
Participate in code reviews, knowledge sharing, and mentoring junior developers.
Integrate with SQL/NoSQL databases and third-party APIs.
Build and maintain data pipelines using Spark or similar tools (if required).

Posted 2 days ago

Apply

10.0 years

0 Lacs

Delhi, India

On-site

YOE: 10 to 15 years
SKILLS REQUIRED: Java, Python, HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, HLD, LLD, SQL, NoSQL, MongoDB, etc.
PREFERENCE: Tier 1 colleges/universities
Role & Responsibilities
Lead and mentor a team of data engineers, ensuring high performance and career growth.
Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
Drive the development and implementation of data governance frameworks and best practices.
Work closely with cross-functional teams to define and execute a data roadmap.
Optimize data processing workflows for performance and cost efficiency.
Ensure data security, compliance, and quality across all data platforms.
Foster a culture of innovation and technical excellence within the data team.
Ideal Candidate
Candidates from Tier 1 colleges preferred.
MUST have experience in product startups, and should have implemented data engineering systems from an early stage in the company.
MUST have 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role.
MUST have expertise in backend development with programming languages such as Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS.
MUST be proficient in SQL, Python, and Scala for data processing and analytics.
Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
MUST have a strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
MUST have experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
Deep knowledge of data governance, security, and compliance (GDPR, SOC 2, etc.).
Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK. Proven ability to drive technical strategy and align it with business objectives. Strong leadership, communication, and stakeholder management skills. Preferred Qualifications: Experience in machine learning infrastructure or MLOps is a plus. Exposure to real-time data processing and analytics. Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture. Prior experience in a SaaS or high-growth tech company.

Posted 2 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Engineering at Innovaccer With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we’re shaping the future and making a meaningful impact on the world. About the Role We’re on a mission to completely change the way healthcare works by building the most powerful Healthcare Intelligence Platform (Gravity) ever made. Using an AI-first approach, our goal is to turn complicated health data into real-time insights that help hospitals, clinics, pharmaceutical companies, and researchers make faster, smarter decisions. We're building a unified platform from the ground up — specifically for healthcare. This platform will bring together everything from: Collecting data from different systems (Data Acquisition) Combining and cleaning it (Integration, Data Quality) Managing patient records and provider info (Master Data Management) Tagging and organizing it (Data Classification & Governance) Running analytics and building AI models (Analytics, AI Studio) Creating custom healthcare apps (App Marketplace) Using AI as a built-in assistant (AI as BI + Agent-first approach) This platform will let healthcare teams build solutions quickly — without starting from scratch each time. For example, they’ll be able to: Track and manage kidney disease patients across different hospitals Speed up clinical trials by analyzing real-world patient data Help pharmacies manage their stock better with predictive supply chain tools Detect early signs of diseases like diabetes or cancer with machine learning Ensure regulatory compliance automatically through built-in checks This is a huge, complex, and high-impact challenge, and we’re looking for a Staff Engineer to help lead the way.
In this role, you’ll: Design and build scalable, secure, and reliable systems Create core features like data quality checks, metadata management, data lineage tracking, and privacy/compliance layers Work closely with other engineers, product managers, and healthcare experts to bring the platform to life If you're passionate about using technology to make a real difference in the world — and enjoy solving big engineering problems — we'd love to connect. A Day in the Life Architect, design, and build scalable data tools and frameworks. Collaborate with cross-functional teams to ensure data compliance, security, and usability. Lead initiatives around metadata management, data lineage, and data cataloging. Define and evangelize standards and best practices across data engineering teams. Own the end-to-end lifecycle of tooling – from prototyping to production deployment. Mentor and guide junior engineers and contribute to technical leadership across the organization. Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions. What You Need 8+ years of experience in software engineering with strong experience building distributed systems. Proficient in backend development (Python, Java, Scala, or Go) and familiar with RESTful API design. Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc. Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus. Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings. Bachelor's or Master’s degree in Computer Science, Engineering, or a related field. Here’s What We Offer Generous Leaves: Enjoy generous leave benefits of up to 40 days. Parental Leave: Leverage one of the industry's best parental leave policies to spend time with your new addition. Sabbatical: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
Health Insurance: We offer comprehensive health insurance to support you and your family, covering medical expenses related to illness, disease, or injury. Extending support to the family members who matter most. Care Program: Whether it’s a celebration or a time of need, we’ve got you covered with care vouchers to mark major life events. Through our Care Vouchers program, employees receive thoughtful gestures for significant personal milestones and moments of need. Financial Assistance : Life happens, and when it does, we’re here to help. Our financial assistance policy offers support through salary advances and personal loans for genuine personal needs, ensuring help is there when you need it most. Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered. Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details. About Innovaccer Innovaccer activates the flow of healthcare data, empowering providers, payers, and government organizations to deliver intelligent and connected experiences that advance health outcomes. 
The Healthcare Intelligence Cloud equips every stakeholder in the patient journey to turn fragmented data into proactive, coordinated actions that elevate the quality of care and drive operational performance. Leading healthcare organizations like CommonSpirit Health, Atlantic Health, and Banner Health trust Innovaccer to integrate a system of intelligence into their existing infrastructure, extending the human touch in healthcare. For more information, visit www.innovaccer.com. Check us out on YouTube , Glassdoor , LinkedIn , Instagram , and the Web .

Posted 2 days ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Job Title: Azure Data Engineer Location: Noida Sec-132 Job Description: 1. Strong experience in Azure - Azure Data Factory, Azure Data Lake, Azure Databricks 2. Good at Cosmos DB, Azure SQL Data Warehouse/Synapse 3. Excellent in data ingestion (batch and real-time processing) 4. Good understanding of Synapse workspace and Synapse Analytics 5. Good hands-on experience with PySpark or Spark with Scala 6. Good hands-on experience with Delta Lake and Spark Streaming 7. Good understanding of Azure DevOps and Azure infrastructure concepts 8. Has at least one end-to-end hands-on project implementation experience as an architect 9. Expert and persuasive communication skills (verbal and written) 10. Expert in presentation and skilled at managing multiple clients. 11. Good at Python/shell scripting 12. Good to have Azure/any cloud certifications.
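Batch ingestion roles like this one revolve around grouped aggregations. As a rough sketch of the shape of such a job — using plain Scala collections rather than Spark, so it runs standalone — the snippet below averages sensor readings per device, the way `df.groupBy("deviceId").agg(avg("reading"))` would on a distributed DataFrame. The `Event` type and field names are illustrative assumptions.

```scala
object BatchAggSketch {
  final case class Event(deviceId: String, reading: Double)

  // Group readings per device and average them. In Spark the same logic runs
  // partitioned across executors; the collection version is handy for tests.
  def avgByDevice(events: Seq[Event]): Map[String, Double] =
    events.groupBy(_.deviceId).map { case (id, es) =>
      id -> es.map(_.reading).sum / es.size
    }

  def main(args: Array[String]): Unit = {
    val es = Seq(Event("a", 1.0), Event("a", 3.0), Event("b", 2.0))
    assert(avgByDevice(es) == Map("a" -> 2.0, "b" -> 2.0))
    println("ok")
  }
}
```

A streaming variant (Spark Structured Streaming, Delta Lake sink) applies the same aggregation over micro-batches, with watermarking for late data.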

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Job Title: Scala Developer Experience: 7+ Years Location: Remote Employment Type: Full-Time Job Summary We are seeking an experienced Scala Developer with 7+ years of expertise in building scalable and high-performance backend applications. The ideal candidate should have a solid understanding of functional programming principles, distributed systems, and cloud-native environments. You will play a key role in designing and implementing robust backend services and collaborating closely with cross-functional teams. Key Responsibilities Design, develop, test, and deploy backend services and APIs using Scala. Collaborate with product managers, frontend developers, and QA teams to deliver high-quality software. Optimize and maintain existing codebases with a focus on performance, scalability, and reliability. Write clean, maintainable, and testable code, adhering to coding best practices and standards. Utilize tools and frameworks such as Akka, Play Framework, Kafka, and other Scala ecosystem technologies. Conduct code reviews, share knowledge, and mentor junior team members. Work with SQL/NoSQL databases and integrate third-party APIs. Build and maintain data processing pipelines using Spark or similar tools (if required). Key Skills and Qualifications 7+ years of experience in backend development with Scala. Strong knowledge of functional programming concepts. Hands-on experience with Akka, Play Framework, Kafka, Spark (preferred). Proficiency in working with databases (SQL and NoSQL). Familiarity with cloud platforms (AWS, GCP, or Azure). Solid understanding of API design, distributed systems, and microservices architecture. Strong problem-solving and debugging skills. Excellent collaboration and communication abilities.

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview Job Title: Spark/Python/Pentaho Developer Location: Pune, India Role Description Spark/Python/Pentaho Developer. Need to work on a Data Integration project, mostly batch oriented, using Python/PySpark/Pentaho. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for 35 yrs. and above Your Key Responsibilities Hands-on Spark/Python/Pentaho programming experience Participating in agile development projects for batch data ingestion. Fast learner, in order to understand the current data landscape and enhance the existing Python/Spark/Pentaho programs. Stakeholder communication Contribute to all stages of the software development lifecycle Analyze user requirements to define business objectives Define application objectives and functionality Develop and test software Identify and resolve any technical issues arising Create detailed design documentation Conduct software analysis, programming, testing, and debugging Software upgrades and maintenance Migration of out-of-support application software Your Skills And Experience Experience: Minimum 5-10 years Spark Python programming Pentaho Good at writing Hive HQLs/SQLs Oracle database Java/Scala experience is a plus Expertise in unit testing.
Know-how with cloud-based infrastructure How We’ll Support You Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
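The posting above stresses unit testing of batch pipelines. One common way to make ETL code testable is to keep the row-level transformation in a pure function, so it can be exercised without a Spark cluster or a Pentaho server. A small sketch, with cleansing rules invented for illustration:

```scala
object TransformSketch {
  // Trim every field, then drop rows that are entirely blank. The same
  // function can be applied to a local test fixture or mapped over an
  // RDD/DataFrame partition inside a Spark job.
  def cleanse(rows: Seq[Array[String]]): Seq[Array[String]] =
    rows.map(_.map(_.trim)).filter(_.exists(_.nonEmpty))

  def main(args: Array[String]): Unit = {
    val in  = Seq(Array(" a ", "b"), Array("  ", ""))
    val out = cleanse(in)
    assert(out.length == 1)
    assert(out.head.sameElements(Array("a", "b")))
    println("ok")
  }
}
```

Separating I/O (reading Hive tables, writing to Oracle) from transformation logic like this is what makes "expertise in unit testing" practical for batch jobs.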

Posted 2 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category Engineering Experience Associate Primary Address Bangalore, Karnataka Overview Voyager (94001), India, Bangalore, Karnataka Associate Software Engineer - Full Stack (INSE35) Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Full Stack Software Engineers who are passionate about marrying data with emerging technologies. As a Capital One Software Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You’ll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies. Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, mentoring other members of the engineering community. Utilize various programming languages, Open Source technologies, Relational/NoSQL databases, Containers and a variety of AWS tools and services with a special focus on Serverless initiatives. Implement automated tests to support continuous delivery while monitoring and troubleshooting basic issues with established deployment pipelines. Implement uncomplicated software according to clearly specified designs and apply design patterns effectively when instructed. Debug nominal issues across backend, frontend, middleware, infrastructure, databases and pipelines across all environments from local to production in a reasonable timeframe. Develop features to enhance the overall user experience based on wireframes and detailed design mockups.
Basic Qualifications: Bachelor’s degree At least 18 months of experience building back-end services (Including but not limited to: Java, JavaScript, Python, Go, Node, Scala, TypeScript, Spring Boot) (Internship experience does not apply) At least 6 months of experience with a front-end language (Including but not limited to: JavaScript, TypeScript) (Internship experience does not apply) At least 6 months of experience with a database technology (Including but not limited to: MySQL, PostgreSQL, MongoDB, Redis, Cassandra, DynamoDB) (Internship experience does not apply) At least 6 months of experience designing, building, and testing distributed systems (Internship experience does not apply) Preferred Qualifications: Experience in Agile or Kanban software development methodologies 1+ year of experience with a cloud computing provider (AWS, Microsoft Azure, Google Cloud) 1+ year of experience implementing functional tests, unit tests, integrated tests or automated tests to support CICD 1+ year of experience with a server side application framework (Django, Express, Spring) 1+ year of experience with a UI framework (Angular, Vue, React) 1+ year of experience building and testing software 1+ year of experience with high level design (HLD) **At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace.
Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC). How We Hire We take finding great coworkers pretty seriously.
Step 1 Apply It only takes a few minutes to complete our application and assessment. Step 2 Screen and Schedule If your application is a good match you’ll hear from one of our recruiters to set up a screening interview. Step 3 Interview(s) Now’s your chance to learn about the job, show us who you are, share why you would be a great addition to the team and determine if Capital One is the place for you. Step 4 Decision The team will discuss — if it’s a good fit for us and you, we’ll make it official! How to Pick the Perfect Career Opportunity Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence. Your wellbeing is our priority Our benefits and total compensation package is designed for the whole person. Caring for both you and your family. Healthy Body, Healthy Mind You have options and we have the tools to help you decide which health plans best fit your needs. Save Money, Make Money Secure your present, plan for your future and reduce expenses along the way. Time, Family and Advice Options for your time, opportunities for your family, and advice along the way. It’s time to BeWell. Career Journey Here’s how the team fits together. We’re big on growth and knowing who and how coworkers can best support you.

Posted 2 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category Engineering Experience Manager Primary Address Bangalore, Karnataka Overview Voyager (94001), India, Bangalore, Karnataka Manager, Data Engineering Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You’ll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems Utilize programming languages like Java, Scala, Python and Open Source RDBMS and NoSQL databases and Cloud based data warehousing services such as Redshift and Snowflake Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance Basic Qualifications: Bachelor’s Degree At least 4 years of experience in application development (Internship experience does not apply) At least 2 years of experience in big data technologies At least 1 year of experience with cloud computing (AWS,
Microsoft Azure, Google Cloud) At least 2 years of people management experience Preferred Qualifications: 7+ years of experience in application development including Python, SQL, Scala, or Java 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud) 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) 4+ years of experience working on real-time data and streaming applications 4+ years of experience with NoSQL implementation (Mongo, Cassandra) 4+ years of data warehousing experience (Redshift or Snowflake) 4+ years of experience with UNIX/Linux including basic commands and shell scripting 2+ years of experience with Agile engineering practices At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category Engineering Experience Sr. Manager Primary Address Bangalore, Karnataka Overview Voyager (94001), India, Bangalore, Karnataka Software Engineer Full Stack - Senior Manager (INSE56) Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Full Stack Software Engineers who are passionate about marrying data with emerging technologies. As a Capital One Software Engineering Senior Manager, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You’ll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies Lead a portfolio of diverse technology projects and a team of developers with deep experience in distributed microservices and full stack systems Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, mentoring other members of the engineering community, and from time to time, be asked to code or evaluate code Utilize various programming languages, Open Source technologies, Relational/NoSQL databases, Containers and a variety of AWS tools and services with a special focus on Serverless initiatives. Implement complex software solutions with little to no support in ambiguous problem spaces, while showing mastery of identifying and creating patterns and practices, sharing those practices with others. 
Formulate and execute on the delivery of an end to end continuous delivery strategy design and contributes new pipeline capabilities to the enterprise, while quickly troubleshooting the most complicated issues with local builds and deployment pipelines Apply modern application architecture and design patterns to the most ambiguous and complex domains, while developing new architectural and design patterns as needed to solve the most challenging issues Debug the most complex issues across backend, frontend, middleware, infrastructure, databases and pipelines across all environments from local to production and across distributed systems Quickly identify root cause, implement fix, correct any affected data, and implement process changes to prevent similar issues in the future Develop the most complex features to enhance the overall user experience from business goals Lead team(s) to collaboratively develop distributed feature sets by providing technical direction or detailed instruction Basic Qualifications: Bachelor’s degree At least 6 years of experience building back-end services (Including but not limited to: Java, JavaScript, Python, Go, Node, Scala, TypeScript, Spring Boot) (Internship experience does not apply) At least 3 years of experience with a front-end language (Including but not limited to: JavaScript, TypeScript) (Internship experience does not apply) At least 3 years of experience with a database technology (Including but not limited to: MySQL, PostgreSQL, MongoDB, Redis, Cassandra, DynamoDB) (Internship experience does not apply) At least 3 years of experience with a UI framework (Including but not limited to: Angular, Vue, React) (Internship experience does not apply) At least 3 years of experience designing, building, and testing distributed systems (Internship experience does not apply) At least 3 years of experience in a technical leadership role overseeing strategic projects At least 3 years of experience in a people management position 
Preferred Qualifications: Experience in Agile or Kanban software development methodologies 6+ years of experience with a cloud computing provider (AWS, Microsoft Azure, Google Cloud) 6+ years of experience implementing functional tests, unit tests, integrated tests or automated tests to support CICD 6+ years of experience with a server side application framework (Django, Express, Spring) 6+ years of experience building and testing software 6+ years of experience with high level design (HLD) **Capital One will not consider sponsoring a new qualified applicant for employment authorization for this position. No agencies please.

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies