8.0 - 13.0 years
16 - 22 Lacs
Hyderabad
Work from Office
Looking for a Data Engineer with 8+ years of experience to build scalable data pipelines on AWS/Azure, work with Big Data tools (Spark, Kafka), and support analytics teams. Must have strong coding skills in Python/Java and experience with SQL/NoSQL and cloud platforms.
Required candidate profile: Strong experience in Java/Scala/Python. Worked with big data technologies: Spark, Kafka, Flink, etc. Built real-time and batch data pipelines. Cloud: AWS, Azure, or GCP.
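The listing asks for both real-time and batch pipelines. As a hedged illustration (the event data and window size are invented for this sketch, not from the posting), here is the tumbling-window aggregation at the heart of many Spark/Kafka streaming jobs, written in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed-size windows and count per key.

    Mirrors the aggregation a Spark Structured Streaming or Kafka Streams job
    would perform, reduced to plain Python for illustration.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(3, "click"), (30, "click"), (61, "view"), (65, "click")]
# Window [0, 60) holds two clicks; window [60, 120) holds one view and one click.
print(tumbling_window_counts(events))
```

The same grouping logic runs identically over a bounded batch or an unbounded stream, which is why frameworks expose one windowing API for both.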
Posted 1 week ago
11.0 - 18.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Responsibilities:
- Understand architecture requirements and ensure effective design, development, validation, and support activities.
- Analyze user requirements, envisioning system features and functionality.
- Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development.
- Contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms.
- Ensure effective design, development, validation, and support activities for Big Data solutions.
Technical and Professional Requirements:
- Proficiency in Scala, Spark, Hive, and Kafka.
- In-depth knowledge of design issues and best practices.
- Solid understanding of object-oriented programming.
- Familiarity with various design and architectural patterns and software development processes.
- Experience with both external and embedded databases.
- Creating database schemas that represent and support business processes.
- Implementing automated testing platforms and unit tests.
Preferred Skills: Technology -> Big Data -> Scala, Spark, Hive, Kafka
Additional Responsibilities (Competencies):
- Good verbal and written communication skills.
- Ability to communicate with remote teams effectively.
- High flexibility to travel.
Educational Requirements: Master of Computer Applications, Master of Technology, Master of Engineering, MSc, Bachelor of Technology, Bachelor of Computer Applications, Bachelor of Computer Science, Bachelor of Engineering
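The requirements above include implementing automated testing platforms and unit tests. As a hedged sketch (the transform and its field names are invented examples, not part of the role), here is what a unit test for a small pipeline step looks like in Python's standard `unittest` framework:

```python
import unittest

def normalize_record(raw):
    """Trim whitespace and lower-case keys before loading -- a typical
    cleansing step early in a data pipeline."""
    return {k.strip().lower(): v.strip() for k, v in raw.items()}

class NormalizeRecordTest(unittest.TestCase):
    def test_strips_and_lowercases_keys(self):
        self.assertEqual(normalize_record({" Name ": " Ada "}),
                         {"name": "Ada"})

    def test_empty_record_passes_through(self):
        self.assertEqual(normalize_record({}), {})

# Run the suite explicitly (avoids unittest.main()'s process exit).
suite = unittest.TestLoader().loadTestsFromTestCase(NormalizeRecordTest)
unittest.TextTestRunner().run(suite)
```

Keeping each transform a pure function, as above, is what makes pipeline code unit-testable without a cluster.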
Posted 1 week ago
6.0 - 11.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Responsibilities:
- Understand architecture requirements and ensure effective design, development, validation, and support activities.
- Analyze user requirements, envisioning system features and functionality.
- Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development.
- Contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms.
- Ensure effective design, development, validation, and support activities for Big Data solutions.
Technical and Professional Requirements:
- Proficiency in Scala, Spark, Hive, and Kafka.
- In-depth knowledge of design issues and best practices.
- Solid understanding of object-oriented programming.
- Familiarity with various design and architectural patterns and software development processes.
- Experience with both external and embedded databases.
- Creating database schemas that represent and support business processes.
- Implementing automated testing platforms and unit tests.
Preferred Skills: Technology -> Big Data -> Scala, Spark, Hive, Kafka
Additional Responsibilities (Competencies):
- Good verbal and written communication skills.
- Ability to communicate with remote teams effectively.
- High flexibility to travel.
Educational Requirements: Master of Computer Applications, Master of Technology, Master of Engineering, MSc, Bachelor of Technology, Bachelor of Computer Applications, Bachelor of Computer Science, Bachelor of Engineering
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are looking for a Big Data Developer to build and maintain scalable data processing systems. The ideal candidate will have experience handling large datasets and working with distributed computing frameworks.
Key Responsibilities:
- Design and develop data pipelines using Hadoop, Spark, or Flink.
- Optimize big data applications for performance and reliability.
- Integrate various structured and unstructured data sources.
- Work with data scientists and analysts to prepare datasets.
- Ensure data quality, security, and lineage across platforms.
Required Skills & Qualifications:
- Experience with the Hadoop ecosystem (HDFS, Hive, Pig) and Apache Spark.
- Proficiency in Java, Scala, or Python.
- Familiarity with data ingestion tools (Kafka, Sqoop, NiFi).
- Strong understanding of distributed computing principles.
- Knowledge of cloud-based big data services (e.g., EMR, Dataproc, HDInsight).
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager, Integra Technologies
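One responsibility above is ensuring data quality across platforms. As an illustrative sketch (the row shape and key column are assumptions, not from the posting), here are two of the most common quality checks, null keys and duplicate keys, in plain Python:

```python
def data_quality_report(rows, key):
    """Count null and duplicate values of a key column -- the kind of basic
    QA gate a pipeline runs before publishing a dataset."""
    seen, nulls, dupes = set(), 0, 0
    for row in rows:
        value = row.get(key)
        if value is None:
            nulls += 1
        elif value in seen:
            dupes += 1
        else:
            seen.add(value)
    return {"rows": len(rows), "null_keys": nulls, "duplicate_keys": dupes}

rows = [{"id": 1}, {"id": 1}, {"id": None}, {"id": 2}]
print(data_quality_report(rows, "id"))  # {'rows': 4, 'null_keys': 1, 'duplicate_keys': 1}
```

In a real deployment these counts would be compared against thresholds and the load failed or flagged when they are exceeded.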
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Surat
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
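The ETL/ELT workflows named above are usually incremental rather than full reloads. As a hedged sketch (the `updated_at` column and row shape are assumptions for illustration), this is the high-watermark pattern behind incremental loads, the same idea DBT's incremental models implement in SQL:

```python
def incremental_batch(source_rows, last_watermark):
    """Select only rows newer than the stored watermark, then advance it.

    Each run processes just the delta since the previous run; the returned
    watermark is persisted for the next run.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    next_watermark = max((r["updated_at"] for r in new_rows),
                         default=last_watermark)  # unchanged if no new rows
    return new_rows, next_watermark

source = [{"id": 1, "updated_at": 5}, {"id": 2, "updated_at": 9}]
delta, watermark = incremental_batch(source, last_watermark=5)
print(delta, watermark)  # [{'id': 2, 'updated_at': 9}] 9
```

The pattern only works when the source reliably stamps every change, which is why monitoring data integrity (another responsibility above) goes hand in hand with it.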
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Varanasi
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Visakhapatnam
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
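Performance optimization of Python pipelines, one of the responsibilities above, often starts with batching: process data in fixed-size chunks instead of materializing everything in memory. A minimal generator sketch (the batch size is an arbitrary illustration):

```python
def chunked(iterable, size):
    """Yield fixed-size lists from any iterable, so a pipeline stage can
    process (or bulk-insert) data batch by batch with bounded memory."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, batch
        yield batch

print(list(chunked(range(7), 3)))  # [[0, 1, 2], [3, 4, 5], [6]]
```

Because `chunked` is lazy, it composes with file readers, database cursors, and message consumers without loading the whole source first.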
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Surat
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Varanasi
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Visakhapatnam
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 1 week ago
5.0 - 7.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Roles and Responsibilities:
- Ensure the ongoing stability, scalability, and performance of PhonePe's Hadoop ecosystem and associated services.
- Exhibit a high level of ownership and accountability to ensure the reliability of the distributed clusters.
- Manage and administer Hadoop infrastructure including Apache Hadoop, HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, Pinot, Ozone, and Druid.
- Automate BAU operations through scripting and tool development.
- Perform capacity planning, system tuning, and performance optimization.
- Set up, configure, and manage Nginx in high-traffic environments.
- Administer and troubleshoot Linux and Big Data systems, including networking (IP, iptables, IPsec).
- Handle on-call responsibilities, investigate incidents, perform root cause analysis, and implement mitigation strategies.
- Collaborate with infrastructure, network, database, and BI teams to ensure data availability and quality.
- Apply system updates and patches, and manage version upgrades in coordination with security teams.
- Build tools and services to improve observability, debuggability, and supportability.
- Enable cluster security using Kerberos and LDAP.
- Work with configuration management and deployment tools like Puppet, Chef, Salt, or Ansible.
Preferred candidate profile:
- Minimum 1 year of Linux/Unix system administration experience.
- Over 4 years of hands-on experience in Apache Hadoop administration.
- Experience in capacity planning and performance tuning of Hadoop clusters.
- Minimum 1 year of experience managing infrastructure on public cloud platforms like AWS, Azure, or GCP (optional).
- Strong understanding of networking, open-source tools, and IT operations.
- Proficient in scripting and programming (Perl, Golang, or Python).
- Hands-on experience maintaining and managing Hadoop ecosystem components like HDFS, YARN, HBase, and Kafka.
- Strong operational knowledge of systems (CPU, memory, storage, OS-level troubleshooting).
- Experience in administering and tuning relational and NoSQL databases.
- Experience in configuring and managing Nginx in production environments.
- Excellent communication and collaboration skills.
Good to have:
- Experience designing and maintaining Airflow DAGs to automate scalable and efficient workflows.
- Experience in ELK stack administration.
- Familiarity with monitoring tools like Grafana, Loki, Prometheus, and OpenTSDB.
- Exposure to security protocols and tools (Kerberos, LDAP).
- Familiarity with distributed systems like Elasticsearch or similar high-scale environments.
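Capacity planning for an HDFS cluster, mentioned in the role above, usually starts from raw data volume times the replication factor plus free-space headroom. A back-of-the-envelope sketch (all figures are illustrative assumptions, not PhonePe's actual sizing; HDFS's default replication factor is 3):

```python
import math

def hdfs_nodes_needed(raw_tb, replication=3, node_capacity_tb=40, headroom=0.25):
    """Estimate datanode count: raw data is stored `replication` times,
    and clusters keep `headroom` fraction of space free for rebalancing
    and growth. Returns the node count rounded up."""
    required_tb = raw_tb * replication * (1 + headroom)
    return math.ceil(required_tb / node_capacity_tb)

# 500 TB raw * 3 replicas * 1.25 headroom = 1875 TB -> 47 nodes of 40 TB each
print(hdfs_nodes_needed(raw_tb=500))  # 47
```

Real sizing also accounts for compression ratios, intermediate/shuffle space, and YARN compute needs, so this estimate is a floor, not a final answer.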
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Lucknow
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Ludhiana
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Ludhiana
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Lucknow
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
15.0 - 20.0 years
0 Lacs
Karnataka
On-site
As the Chief Technology Officer for our client, you will lead the company's technical vision, oversee all aspects of technology development, and ensure the delivery of highly scalable applications. Working closely with the CEO and other business leaders, you will be responsible for developing, leading, and evolving a new-generation media platform. Your role will be instrumental in scaling and growing the business, defining strategies, and creating roadmaps for delivering cutting-edge products.

We are seeking a candidate with 15-20 years of experience in Online Media/New Gen Media, Internet & Technology domains. You should have a strong background in building both B2B and B2C customer-centric products and software at large scale, as well as developing technical capabilities to drive innovation and product differentiation. A successful track record of building high-performing teams that prioritize innovation, intellectual property, and collaboration is essential.

The ideal candidate should have held positions such as Senior Director of Technology, Senior VP IT, Head of Product & Engineering - IT, or CTO in their current organization. Deep knowledge of full-stack modern development practices including Node.js, Angular, Java, JavaScript, AWS/Azure, DS, Flutter, and React is required. Hands-on experience in digital initiatives, big data, mobile apps, AI & ML, analytics, and business intelligence solutions will be advantageous.

If you are ready to take on this exciting opportunity and further your career in technology leadership, we invite you to connect with us and explore this role further.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Kochi, Kerala
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

We're looking for a candidate with 10-12 years of expertise in data science, data analysis, and visualization, who will act as a Technical Lead to a larger team in the EY GDS DnA team working on various Data and Analytics projects.
Your key responsibilities include:
- Understanding of insurance domain knowledge (PnC or life or both)
- Being responsible for the development of the conceptual, logical, and physical data models, the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Overseeing and governing the expansion of existing data architecture and the optimization of data query performance via best practices
- Working independently and collaboratively
- Implementing business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Working with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Defining and governing data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identifying the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC
- Working proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
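The responsibilities above center on dimensional data models and data marts. As a toy sketch (the table name, natural key, and attributes are invented for illustration), here is a type-1 slowly-changing-dimension merge, the simplest way a dimension table absorbs updates, in plain Python:

```python
def upsert_dimension(dim_rows, incoming_rows, natural_key="customer_id"):
    """Type-1 SCD merge: overwrite attributes in place, keyed on the
    natural key. New keys are inserted; existing keys are updated.
    (Type-2, by contrast, would version rows with effective dates.)"""
    index = {row[natural_key]: row for row in dim_rows}
    for row in incoming_rows:
        # Merge incoming attributes over whatever is already stored.
        index[row[natural_key]] = {**index.get(row[natural_key], {}), **row}
    return list(index.values())

dim = [{"customer_id": 1, "city": "Pune"}]
incoming = [{"customer_id": 1, "city": "Kochi"}, {"customer_id": 2, "city": "Surat"}]
print(upsert_dimension(dim, incoming))
```

In a warehouse the same logic is expressed as a `MERGE` statement; the Python version just makes the keying and overwrite semantics explicit.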
Skills and attributes for success include:
- Strong communication, presentation, and team-building skills
- Experience in executing and managing research and analysis of companies and markets
- BE/BTech/MCA/MBA with 8-12 years of industry experience with machine learning, visualization, data science, and related offerings
- At least 4-8 years of experience in BI and Analytics
- Ability to deliver end-to-end data solutions from analysis, mapping, profiling, ETL architecture, and data modeling
- Knowledge and experience of at least one Insurance domain engagement (Life or Property and Casualty)
- Good experience using CA Erwin or other similar modeling tools
- Strong knowledge of relational and dimensional data modeling concepts
- Experience in data management analysis
- Experience with unstructured data is an added advantage
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud preferred
- Experience, interest, and adaptability to working in an Agile delivery environment

Ideally, you'll also have:
- Good exposure to any ETL tools
- Good to have knowledge about P&C insurance
- Must have led a team size of at least 4 members
- Experience in the Insurance and Banking domains
- Prior client-facing skills; self-motivated and collaborative

What we look for:
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects.
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The Data and Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Mining & Management, Visualization, Business Analytics, Automation, Statistical Insights, and AI/GenAI. The assignments cover a wide range of countries and industry sectors. We are looking for an Associate Manager - AI/GenAI, proficient in Artificial Intelligence, Machine Learning, deep learning and LLM models for Generative AI, text analytics, and Python programming. You will be responsible for developing and delivering industry-sector-specific solutions which will be used to implement the EY SaT mergers and acquisition methodologies.

**Your key responsibilities:**
- Develop, review, and implement solutions applying AI, Machine Learning, and Deep Learning, and develop APIs using Python.
- Lead the development and implementation of Generative AI applications using open source and closed source Large Language Models (LLMs).
- Work with advanced models for natural language processing and creative content generation using contextual information.
- Design and optimize solutions leveraging Vector databases for efficient storage and retrieval of contextual data for LLMs.
- Understand business and sectors, identify whitespaces and opportunities for analytics application.
- Work on large to mid-size projects, ensure smooth service delivery, and provide expert reviews.
- Communicate with cross-functional teams and manage stakeholder relationships.

**Skills and attributes for success:**
- Able to work creatively and systematically in a time-limited, problem-solving environment.
- Loyal, reliable, with high ethical standards.
- Flexible, curious, creative, and able to propose innovative ideas.
- Good interpersonal skills, team player, and positive in a group dynamic.
- Intercultural intelligence, experience of working in multi-cultural teams.
- Ability to manage multiple priorities simultaneously and drive projects to completion with minimal supervision.

**To qualify for the role, you must have:**
- Experience in guiding teams on AI/Data Science projects and communicating results to clients.
- Familiarity with implementing solutions in Azure Cloud Framework.
- Excellent presentation skills.
- 8-10 years of relevant work experience in developing and implementing AI and Machine Learning models.
- Experience in statistical techniques, deep learning, and machine learning algorithms.
- Proficiency in Python programming and SDLC principles.
- Willingness to mentor team members and travel extensively.
- Excellent written and verbal communication skills.

**Ideally, you'll also have:**
- Ability to think strategically and build rapport within the firm and with clients.

If you are a team player with commercial acumen, technical experience, and enthusiasm to learn new things in a fast-moving environment, and interested in being part of a market-prominent, multi-disciplinary team, this opportunity at EY could be the next step in your career growth.
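The role above calls for vector databases that retrieve contextual data for LLMs. At their core these rank stored embeddings by similarity to a query embedding. A minimal pure-Python sketch of that retrieval step (the documents and 2-dimensional vectors are toy stand-ins for real embeddings):

```python
import math

def top_k(query, vectors, k=2):
    """Rank stored vectors by cosine similarity to the query and return the
    ids of the k closest -- the lookup a vector database performs when
    fetching context for an LLM prompt."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm
    ranked = sorted(vectors.items(), key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

docs = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [0.7, 0.7]}
print(top_k([1.0, 0.1], docs, k=2))  # ['a', 'c']
```

Production systems replace the exhaustive scan with approximate nearest-neighbor indexes, but the ranking criterion is the same.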
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
As a Senior Engineer - Technical Support, you will leverage your 5 to 8 years of experience in software and hardware infrastructure implementation and support. Your role will be multi-functional, encompassing expertise in software, hardware, networking infrastructure, storage, virtualization, customer training, and technical support.

Your responsibilities will include assisting pre-sales teams in crafting techno-commercial proposals, selecting appropriate hardware infrastructure components, estimating pricing, and preparing delivery schedules for product deployments. You will play a key part in designing and implementing deployment architecture for ClearTrail's products at customer premises, incorporating cutting-edge Big Data technologies across hundreds of servers and petabyte scales of storage. Interacting with customers to address integration issues with telecom networks and traveling to customer locations for implementation will be part of your routine. You will also provide first- and second-level product support, collaborating with QA and Engineering teams to ensure issue resolution within agreed SLAs.

In this role, you will be responsible for identifying hardware and software requirements, staying updated on the latest trends in infrastructure, designing deployment architectures, networking, storage, and virtualization solutions, as well as ensuring network security based on customer needs. Automation for deployments and upgrades, diagnosing and resolving customer-reported problems, and providing technical reviews of documentation and specifications will be integral to your tasks. Moreover, you will contribute to knowledge-sharing initiatives, product release training, and documentation. Your involvement in design reviews with customers will be crucial for aligning solutions with their expectations and requirements.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Software Developer specializing in Java with React/Angular, SQL, and APIs, your main role will involve the design, development, and maintenance of software applications built on Java and its related technologies. Proficiency in React or Angular will be advantageous for creating modern, dynamic user interfaces for web applications. Alongside your Java skills, you should have a strong command of HTML, CSS, and JavaScript.

For this position, it is crucial to have knowledge of software design patterns, Unix environments, and database technologies, including SQL and NoSQL and working with databases such as Oracle and Netezza. Experience in RESTful web services, API design, and full-stack Java development would be highly beneficial. Knowledge of Redis and experience with NiFi and APIs are additional assets.

As part of Agile teams, you will contribute your expertise in Data Engineering and in implementing end-to-end DW projects in a Big Data environment. Strong analytical abilities are required for debugging production issues, offering root cause analysis, and implementing mitigation plans. Effective verbal and written communication is essential, along with excellent relationship-building, collaboration, and organizational skills.

In this role, you will multitask across various projects, interact with internal and external resources, and provide technical guidance to junior team members. A high-energy, detail-oriented, and proactive approach, combined with the ability to work under pressure independently, will be invaluable. Initiative and self-motivation are key qualities for driving results. You should also be quick to learn and apply new technologies, conducting POCs to identify optimal solutions for problem statements. The flexibility to collaborate in diverse, geographically distributed project teams within a matrix-based environment is also essential.
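The SQL and API design skills the posting asks for can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3 module and parameterized queries. All table, column, and function names here are hypothetical, chosen only for illustration:

```python
import sqlite3

def get_orders(conn, customer_id):
    """Return orders for one customer via a parameterized query
    (placeholders keep user input out of the SQL text itself)."""
    cur = conn.execute(
        "SELECT id, amount FROM orders WHERE customer_id = ? ORDER BY id",
        (customer_id,),
    )
    return cur.fetchall()

# Minimal in-memory demo
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 42, 9.99), (2, 42, 5.00), (3, 7, 1.25)],
)
print(get_orders(conn, 42))  # [(1, 9.99), (2, 5.0)]
```

The same parameterized style carries over to Oracle or Netezza drivers; only the placeholder syntax and connection setup change.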
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Big Data Data Modeller, you will play a crucial role in leading moderately complex initiatives and deliverables within technical domain environments. Your responsibilities will include contributing to large-scale strategy planning and designing, coding, testing, debugging, and documenting projects and programs associated with the technology domain, including upgrades and deployments. Additionally, you will review technical challenges that require in-depth evaluation of technologies and procedures, resolve complex issues, and lead a team to meet the needs of existing and potential new clients while leveraging a solid understanding of the function, policies, procedures, and compliance requirements. Collaborating and consulting with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals will be a key aspect of your role. You will also lead projects, act as an escalation point, and provide guidance and direction to less experienced staff.

Desired Qualifications:
- Minimum of 6 years of hands-on experience in Big Data software enterprise application development
- Proficiency in continuous integration and delivery practices when developing code
- Experience in the Banking/Financial technology domain preferred

Job Expectations:
- Collaborate with scrum stakeholders to implement modernized and sustainable technology roadmaps
- Analyze technical requirements and implement software solutions
- Stay updated on current and future trends and practices in technology
- Resolve application issues with software solutions and respond to suggestions for improvements and enhancements
- Proactively manage risk through the implementation of appropriate controls and escalate where required
- Work with the engineering manager, product owner, and team to ensure the product is delivered with quality, on time, and within budget
- Coordinate project interlocks and deployments with internal IT teams
- Possess strong verbal and written communication skills to work effectively in a global development environment

This is a full-time position with a day shift schedule, requiring in-person work at the Bangalore location. The application deadline for this opportunity is 08/08/2025.
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
maharashtra
On-site
Model Risk Management (MRM) is part of Citi's Global Risk Management and is responsible for independent oversight of models across the firm. Citi is seeking a Vice President to join the System Strategy and Oversight Team within the Model Risk Management Inventory & Initiative Management Group. The role requires experience in risk management; SDLC, Waterfall, Iterative, and Agile methodologies; and expertise in project management and governance, as well as process reengineering, business architecture, simplification, controls, and UAT. Experience in developing solutions that drive automation of Gen AI/modeling tools or in building reporting frameworks would be a big plus, as would familiarity with the FRB's Supervisory Guidance on MRM, SR 11-7 and SR 15-18.

The MRM System Strategy & Oversight (SSO) Lead will drive reengineering of MRMS, the Citi Model Risk Management System, in line with the Model Risk Management Policy and Procedures and the overall model risk system strategy. They will translate policies, procedures, and guidelines into process maps and concrete tasks; identify dependencies, decision points, actors, and opportunities for streamlining; and build system solutions to support them. The role involves collaborating with stakeholders both within and outside Risk Management to identify, streamline, simplify, and implement model life cycle processes in MRMS. Responsibilities also include authoring business requirements, re-engineering processes and system solutions to drive simplification and automation, liaising with IT partners to build effective system solutions, and partnering with validation and development groups to integrate metrics, documentation digitization, and Gen AI POCs with the MRMS target state.

The ideal candidate should have 12+ years of working experience, including 5+ years in product development or an equivalent role, and should be familiar with the O&T development cycle as well as with model risk management or similar. Experience supporting cross-functional projects with project management and technology on system enhancements is required. Additionally, knowledge of process design and database design and high proficiency in SQL are essential; institutional knowledge of Citi platforms/applications is preferred. Strong interpersonal and project management skills are desired, along with experience in Python, R, or other programming languages for implementing POCs. Expert-level knowledge of MS Excel for data analytics (including VBA), MS PowerPoint for executive presentations, MS Word for business documentation, and MS Visio for process flows and swim lanes is also expected. A Bachelor's degree in finance, mathematics, computer science, or a related field is required, with a Master's degree preferred.

Working at Citi means joining a family of more than 230,000 dedicated people from around the globe. It offers the opportunity to grow your career, give back to your community, and make a real impact. If you are looking to take the next step in your career, consider applying for this role at Citi today.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The Data Engineer is responsible for developing high-quality data products to meet the Bank's regulatory requirements and support data-driven decision making. As a Mantas Scenario Developer, you will set an example for other team members, collaborate closely with customers, and address any roadblocks that arise. Your expertise in data architecture standards, data warehousing, data structures, and business intelligence will play a key role in contributing to business outcomes within an agile team. Your responsibilities will include developing and supporting scalable, extensible, and highly available data solutions; ensuring alignment with the wider architectural vision; identifying and mitigating risks in the data supply chain; adhering to technical standards; and designing analytical data models.

To excel in this role, you should possess a First-Class degree in Engineering/Technology along with 5 to 8 years of experience in implementing data-intensive solutions using agile methodologies. Proficiency in relational databases, SQL for data querying, and data modeling for analytical consumers, plus hands-on experience with Mantas throughout the full development life cycle, are essential. You should also be able to translate business needs into technical solutions, automate data pipelines, and have a passion for learning new technologies.

Key Technical Skills Required:
- ETL: Proficiency in building data pipelines using platforms like Ab Initio, Apache Spark, Talend, or Informatica
- Mantas: Expertise in Oracle Mantas/FCCM, Scenario Manager, Scenario Development, and FSDM
- Big Data: Experience with platforms such as Hadoop, Hive, or Snowflake
- Data Warehousing & Database Management: Understanding of data warehousing concepts and relational & NoSQL database design
- Data Modeling & Design: Exposure to data modeling techniques and maintenance of data models
- Languages: Proficiency in Python, Java, or Scala
- DevOps: Exposure to CI/CD platforms, version control, and automated quality control management

Additional Valuable Technical Skills:
- Ab Initio: Experience in developing Co>Op graphs and tuning for performance
- Cloud: Exposure to public cloud data platforms like S3, Snowflake, or Redshift
- Data Quality & Controls: Familiarity with data validation, cleansing, enrichment, and data controls
- Containerization: Understanding of containerization platforms like Docker and Kubernetes
- File Formats: Working knowledge of event/file/table formats such as Avro, Parquet, or Protobuf
- Others: Basics of job schedulers like Autosys and entitlement management

Certifications in any of the above topics would be advantageous. Join us as a Data Engineer to leverage your technical skills and contribute to the development of innovative data solutions that drive business success.
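The extract-transform-load pattern behind the pipeline skills above can be sketched in plain Python, independent of any specific platform. This is a minimal, framework-free illustration; all function and field names are hypothetical, and a real Mantas or Ab Initio pipeline would of course look quite different:

```python
def extract(rows):
    """Extract stage: yield raw records (dicts standing in for source rows)."""
    yield from rows

def transform(records):
    """Transform stage: apply a data-quality control (drop rows with a
    missing amount) and enrich each survivor with a converted amount."""
    for r in records:
        if r.get("amount") is None:
            continue  # skip incomplete rows rather than propagate nulls
        yield {**r, "amount_usd": round(r["amount"] * r.get("fx_rate", 1.0), 2)}

def load(records):
    """Load stage: collect into a list (a real pipeline writes to a table)."""
    return list(records)

source = [
    {"id": 1, "amount": 100.0, "fx_rate": 1.1},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 50.0},
]
result = load(transform(extract(source)))
print(result)
```

Because each stage is a generator feeding the next, records stream through one at a time, which is the same composition idea the listed ETL platforms scale out across a cluster.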
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Site Reliability Engineering (SRE) Technical Leader on the Network Assurance Data Platform (NADP) team at Cisco ThousandEyes, you will be responsible for the reliability, scalability, and security of the cloud and big data platforms. Your role will involve representing the NADP SRE team, contributing to the technical roadmap, and collaborating with cross-functional teams to design, build, and maintain SaaS systems operating at multi-region scale. Your efforts will be crucial in supporting machine learning (ML) and AI initiatives by ensuring the platform infrastructure is robust, efficient, and aligned with operational excellence.

You will design, build, and optimize cloud and data infrastructure to guarantee high availability, reliability, and scalability of big data and ML/AI systems, implementing SRE principles such as monitoring, alerting, error budgets, and fault analysis. Additionally, you will collaborate with various teams to create secure and scalable solutions, troubleshoot technical problems, lead the architectural vision, and shape the technical strategy and roadmap. The role also encompasses mentoring and guiding teams, fostering a culture of engineering and operational excellence, engaging with customers and stakeholders to understand use cases and feedback, and applying your strong programming skills to integrate software and systems engineering. Furthermore, you will develop strategic roadmaps, processes, plans, and infrastructure to efficiently deploy new software components at enterprise scale while enforcing engineering best practices.

To be successful in this role, you should have 8-12 years of relevant experience and a bachelor's engineering degree in computer science or its equivalent. You should be able to design and implement scalable solutions and have hands-on experience in cloud (preferably AWS), Infrastructure as Code skills, experience with observability tools, proficiency in programming languages such as Python or Go, and a good understanding of Unix/Linux systems and client-server protocols. Experience in building cloud, big data, and/or ML/AI infrastructure is essential, along with a sense of ownership and accountability in architecting software and infrastructure at scale. Experience with the Hadoop ecosystem, certifications in cloud and security domains, and experience building or managing a cloud-based data platform would be advantageous.

Cisco encourages individuals from diverse backgrounds to apply, as the company values perspectives and skills that emerge from employees with varied experiences. Cisco believes in unlocking potential and creating diverse teams that are better equipped to solve problems, innovate, and make a positive impact.
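The error budgets named among the SRE principles above come down to simple arithmetic: an SLO of 99.9% leaves 0.1% of the window as allowed downtime. A minimal sketch (the SLO value and 30-day window are illustrative, not from the posting):

```python
def error_budget_minutes(slo: float, window_minutes: int) -> float:
    """Allowed downtime for the window: the (1 - SLO) slice of total time."""
    return (1.0 - slo) * window_minutes

def budget_remaining(slo: float, window_minutes: int, downtime_minutes: float) -> float:
    """Fraction of the error budget still unspent; negative means the SLO is breached."""
    budget = error_budget_minutes(slo, window_minutes)
    return (budget - downtime_minutes) / budget

# A 99.9% SLO over a 30-day window allows roughly 43.2 minutes of downtime.
window = 30 * 24 * 60
print(round(error_budget_minutes(0.999, window), 2))
print(budget_remaining(0.999, window, 10.0))
```

In practice the remaining-budget fraction is what alerting burns against: when it trends toward zero faster than the window elapses, releases slow down and reliability work takes priority.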
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
The Applications Development Senior Programmer Analyst position entails participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your main objective in this role is to contribute to applications systems analysis and programming activities.

Responsibilities include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users. You will apply in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. Additionally, you will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Consulting with users, clients, and other technology groups on issues, recommending advanced programming solutions, and installing and assisting customer exposure systems are also part of your responsibilities. You will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will be expected to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and other team members.

Qualifications for this position include:
- 8 to 12 years of application development experience with Java/J2EE technologies
- Experience with core Java/J2EE applications, with complete command of OOP and design patterns
- Proficiency in data structures and algorithms
- Thorough knowledge of and hands-on experience with Big Data technologies: Hadoop, with experience on Hive or Java-based Spark programming
- Implementation of, or participation in, complex project execution in the Big Data Spark ecosystem
- Working in an agile environment following agile Scrum best practices
- Expertise in designing and optimizing software solutions for performance and stability
- Strong troubleshooting and problem-solving skills
- Experience in test-driven development

Education required for this role:
- Bachelor's degree/University degree or equivalent experience

This is a full-time position in the Technology Job Family Group, specifically within the Applications Development Job Family.
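The "command of OOP and design patterns" the qualifications call for is the kind of thing interviews probe with small examples. One classic is the Strategy pattern: behavior is injected rather than hard-coded, which also makes each piece trivially unit-testable (useful for the test-driven development requirement). A minimal sketch in Python with entirely hypothetical class names:

```python
from abc import ABC, abstractmethod

class PricingStrategy(ABC):
    """Strategy interface: each concrete strategy prices an order differently."""
    @abstractmethod
    def price(self, base: float) -> float: ...

class Standard(PricingStrategy):
    def price(self, base: float) -> float:
        return base

class HalfPrice(PricingStrategy):
    def price(self, base: float) -> float:
        return base * 0.5

class Order:
    """Context: delegates pricing to an injected strategy, so behavior can be
    swapped or mocked without touching the Order class itself."""
    def __init__(self, base: float, strategy: PricingStrategy):
        self.base = base
        self.strategy = strategy

    def total(self) -> float:
        return self.strategy.price(self.base)

print(Order(100.0, Standard()).total())   # 100.0
print(Order(100.0, HalfPrice()).total())  # 50.0
```

The same structure maps directly onto Java with an interface and constructor injection; only the syntax changes.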
Posted 1 week ago