9.0 - 12.0 years
13 - 18 Lacs
Pune
Work from Office
Educational Requirements: Master of Computer Applications, Master of Science, Master of Technology, Bachelor of Computer Science, Bachelor of Engineering, Bachelor of Technology
Service Line: Application Development and Maintenance
Responsibilities:
- Knowledge of architectural design patterns, performance tuning, and database and functional design
- Hands-on experience in Service-Oriented Architecture
- Ability to lead solution development and delivery for the designed solutions
- Experience in designing high-level and low-level documents is a plus
- Good understanding of the SDLC is a prerequisite
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to apply design patterns to make applications reliable, scalable, and highly available
- Ability to design microservices- and serverless-based architectures
- Work with client architects to define top-notch solutions
Additional Responsibilities:
- Good verbal and written communication skills
- Experience in leading teams technically
- Ability to communicate effectively with remote teams
- High flexibility to travel
- Strong analytical, logical, and team-leading skills
- Engage in business development as well as in building and maintaining client relationships
Technical and Professional Requirements:
- .NET Architecture; Cloud Platform - Azure Developer; Cloud Platform - Google Cloud Platform Developer (GCP/Google Cloud); Cloud Platform - Amazon Web Services Developer (AWS/PaaS)
- Experience in Microsoft .NET technology, specializing in the design, development, integration, implementation, testing, delivery, and maintenance of enterprise applications
- Experience in architecture design for web applications, Windows applications, web services, and Web APIs, including client/server and N-tier application design
- Expertise in writing complex queries and stored procedures, data modeling, and implementing tables, views, and triggers using Oracle and Microsoft SQL Server database systems
- Very strong object-oriented programming skills
- Experience in Angular or React JS
- Cloud solution architecture exposure is good to have
Preferred Skills: .Net-Architecture; Technology-UI & Markup Language-Angular JS/Angular 1.x; Technology-Cloud Platform-Azure AI Services-Computer Vision with Azure
Posted 1 week ago
7.0 - 9.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Data Quality, IDQ
Preferred Skills: Technology-ETL & Data Quality-ETL & Data Quality - ALL
Posted 1 week ago
2.0 - 3.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Educational Requirements: Master of Engineering, Master of Science, Master of Technology, Bachelor of Computer Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology
Service Line: Cloud & Infrastructure Services
Responsibilities: As an Infra AI Automation Programmer, you will work on the design, development, and deployment of AI systems that generate content or data, often using techniques such as deep learning, neural networks, and generative models, with Python back-end coding.
- Meet with clients in person and remotely to understand their business objectives and challenges, and identify potential use cases for generative AI.
- Assess the client's existing technology infrastructure, data landscape, and AI maturity to determine the feasibility of and best approaches for Gen AI adoption.
- Conduct workshops and discovery sessions to brainstorm Gen AI applications across various business functions (e.g., content creation, customer service, product development, marketing).
Gen AI Strategy and Solution Design:
- Develop tailored Gen AI strategies and roadmaps for clients, outlining specific use cases, implementation plans, and expected business outcomes.
- Design Gen AI solution architectures, considering factors such as model selection (e.g., large language models (LLMs), diffusion models), data requirements, integration with existing systems, and scalability.
- Evaluate and recommend appropriate Gen AI platforms, tools, and APIs (e.g., OpenAI, Google AI, Azure AI, open-source libraries) relevant to the client's needs and the Indian market.
- Advise on prompt engineering techniques to optimize the output and performance of Gen AI models for specific client applications.
Additional Responsibilities: Besides professional qualifications, we place great importance on various aspects of the candidate's personality profile. These include: high analytical skills; a high degree of initiative and flexibility; high customer orientation; high quality awareness; excellent verbal and written communication skills.
Technical and Professional Requirements:
- At least 2 years of programming experience in Python
- Hands-on experience working on Gen AI
- Hands-on experience with TensorFlow, PyTorch, LangChain, and prompt engineering
- Experience with vector databases and retrieval-augmented generation (RAG); an illustrative sketch of the retrieval step follows this posting
- Familiarity with MLOps principles and tools for deploying and managing Gen AI models
- Understanding of ethical considerations and responsible AI frameworks
- Understanding of data structures, data modelling, SQL, and NoSQL
- Object-oriented and functional programming, and Git
- Knowledge of basic algorithms and object-oriented and functional design principles
- Deep knowledge of mathematics, probability, statistics, and algorithms
- Good communication skills
- Good analytical and problem-solving skills
Preferred Skills: Technology-Machine Learning-TensorFlow; Technology-Machine Learning-Python; Technology-Machine Learning-Generative AI
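For context on the retrieval-augmented generation (RAG) skill mentioned above, here is a minimal, hedged sketch of the retrieval step only. It uses TF-IDF as a stand-in for learned embeddings and an in-memory list as a stand-in for a vector database; the corpus, query, and prompt template are illustrative assumptions, not part of the posting.

```python
# Minimal sketch of the retrieval step in a RAG pipeline.
# Assumptions: TF-IDF stands in for learned embeddings; a Python list stands in
# for a vector database; the corpus and query below are made-up examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Invoices are processed within three business days.",
    "Refunds require manager approval above 500 USD.",
    "Customer data is retained for seven years.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(corpus)  # "index" the documents

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors).ravel()
    top_idx = scores.argsort()[::-1][:k]
    return [corpus[i] for i in top_idx]

if __name__ == "__main__":
    question = "Who has to approve a large refund?"
    context = "\n".join(retrieve(question))
    # In a full RAG system, `context` would be inserted into the LLM prompt:
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    print(prompt)
```

In a production setup the TF-IDF index would be replaced by an embedding model plus a vector database, and the assembled prompt would be sent to an LLM; the retrieval-then-augment flow stays the same.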
Posted 1 week ago
2.0 - 7.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Educational Requirements: Master of Business Administration, Master of Commerce, Master of Engineering, Master of Technology, Master of Technology (Integrated), Bachelor of Business Administration, Bachelor of Commerce, Bachelor of Engineering, Bachelor of Technology, Bachelor of Technology (Integrated)
Service Line: Enterprise Package Application Services
Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products, diagnose the root cause of any issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
Technical and Professional Requirements:
- At least 2 years of configuration and development experience in implementing OFSAA solutions (such as ERM, EPM, etc.)
- Expertise in implementing OFSAA technical areas covering OFSAAI and frameworks: Data Integrator, Metadata Management, Data Modelling
- Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyze result-area tables and derived entities
- Perform data analysis using OFSAA metadata (i.e., Technical Metadata, Rule Metadata, Business Metadata), identify any data mapping gaps, and report them to stakeholders
- Participate in requirements workshops, help implement the designed solution, support testing (UT, SIT), coordinate user acceptance testing, etc.
- Knowledge of and experience with the full SDLC lifecycle
- Experience with Lean/Agile development methodologies
Preferred Skills: Technology-Oracle Industry Solutions-Oracle Financial Services Analytical Applications (OFSAA)
Posted 1 week ago
2.0 - 3.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Enterprise Package Application Services
Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products, diagnose the root cause of any issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Jaipur, Vizag, Kolkata, Mysore, Hubli. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
Technical and Professional Requirements:
- Candidate should have a bachelor's or equivalent degree with a minimum of 2 years of experience
- Must have experience on SAP MDG projects, performing MDG configurations and customizations in areas such as Data Modeling, UI Modeling, Process Modeling, Data Replication Framework, Key Mapping, rules and derivations, and BRF+
- Should have expertise in SAP MDG configurations and replication configurations, and technical knowledge of MDG workflows and ERP tables
- Full life-cycle implementation experience, including blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-live
- Should have technical knowledge of and hands-on experience with customization of MDG workflows, Floor Plan Manager, WDABAP, and BRF+, and hands-on development in BADI/ABAP
- Knowledge of WDABAP framework-based customizations, OOP programming in SAP MDG, and FPM and UI customizations
- Work with the technical teams to complete the SAP MDG implementation for Material, Customer, Supplier, and FI objects
Preferred Skills: Technology-SAP Functional-SAP MDG
Posted 1 week ago
9.0 - 11.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Enterprise Package Application Services
Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality and value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions to customers, adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Good knowledge of software configuration management systems
- Strong business acumen, strategy, and cross-industry thought leadership
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Knowledge of two or three industry domains
- Understanding of the financial processes for various types of projects and the various pricing models available
- Client-interfacing skills
- Knowledge of the SDLC and agile methodologies
- Project and team management
Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Jaipur, Vizag, Kolkata, Mysore, Hubli. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
Technical and Professional Requirements:
- Candidate should have a bachelor's or equivalent degree with a minimum of 9 years of experience
- Must have experience on SAP MDG projects, performing MDG configurations and customizations in areas such as Data Modeling, UI Modeling, Process Modeling, Data Replication Framework, Key Mapping, rules and derivations, and BRF+
- Should have expertise in SAP MDG configurations and replication configurations, and technical knowledge of MDG workflows and ERP tables
- Full life-cycle implementation experience, including blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-live
- Should have technical knowledge of and hands-on experience with customization of MDG workflows, Floor Plan Manager, WDABAP, and BRF+, and hands-on development in BADI/ABAP
- Knowledge of WDABAP framework-based customizations, OOP programming in SAP MDG, and FPM and UI customizations
- Work with the technical teams to complete the SAP MDG implementation for Material, Customer, Supplier, and FI objects
Preferred Skills: Technology-SAP Functional-SAP MDG
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Kochi
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 4+ years of experience in data modelling and data architecture
- Proficiency in data modelling tools (ERwin, IBM InfoSphere Data Architect) and database management systems
- Familiarity with different data models, such as relational and dimensional models and NoSQL databases
- Understanding of business processes and how data supports business decision making
- Strong understanding of database design principles, data warehousing concepts, and data governance practices
Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with keen attention to detail
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously
- Knowledge of programming languages such as SQL
Posted 1 week ago
3.0 - 7.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Your Role and Responsibilities: An AI Solution Engineer at IBM is not just a job title - it's a mindset. You'll leverage the watsonx platform to co-create AI value with clients, focusing on technology patterns to enhance repeatability and delight clients. We are seeking an experienced and innovative AI Solution Engineer specialized in foundation models and large language models. In this role, you will be responsible for architecting and delivering AI solutions using cutting-edge technologies, with a strong focus on foundation models and large language models. You will work closely with customers, product managers, and development teams to understand business requirements and design custom AI solutions that address complex challenges. Success is our passion, and your accomplishments will reflect this, driving your career forward, propelling your team to success, and helping our clients to thrive.
Day-to-Day Duties:
- Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations.
- Customer Engagement and Support: Act as a technical point of contact for customers, addressing their questions, concerns, and feedback. Provide technical support during the solution deployment phase and offer guidance on AI-related best practices and use cases.
- Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best-practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.
- Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.
Required education: Bachelor's Degree
Required technical and professional expertise:
- Understanding of key concepts in the foundation models literature and expertise in building and deploying them for real-world examples
- Knowledge of cloud technologies and expertise in leveraging them for large-scale AI or FM workloads
- Ability to identify fundamental problems from real-world cloud use cases and to design, build, and validate successful AI solutions
- Capability to demonstrate and evaluate AI solutions via experimental methods, particularly through hands-on creation of prototypes
- Strong communication skills and the ability to collaborate effectively within a local team
- Excellent command of the English language, both verbal and written
Preferred technical and professional experience:
- Strong contribution record demonstrated through active participation in open-source communities, sharing technical resources and educational content, maintaining influential technical blogs or a social media presence, and public evangelism of technology through speaking engagements, workshops, and community events
- Track record of being part of highly collaborative teams tackling important problems and producing high-impact solutions
- Expertise in Python programming and statistical analysis, including experience with advanced statistical methods, data modelling, and implementing efficient algorithmic solutions
Posted 1 week ago
5.0 - 9.0 years
13 - 17 Lacs
Mumbai
Work from Office
The role creates and maintains strong, trusted client relationships at all levels within the client's business. The architect designs solution recommendations for clients, applying their technical skills and designing centralized or distributed systems that both address user requirements and perform efficiently and effectively for banks. They should have an understanding of the banking industry and expertise in leveraging IBM technologies, architectures, integrated solutions, and offerings, focusing on the elements required to manage all aspects of data and information, from business requirements to logical and physical design. They should be able to lead the information management lifecycle from acquisition, through cleansing, transformation, classification, and storage, to presentation, distribution, security, privacy, and archiving.
Required education: Bachelor's Degree
Preferred education: Bachelor's Degree
Required technical and professional expertise: Data Architect - Banking: data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge.
Preferred technical and professional experience: Data architect with deep knowledge of Cassandra, Pulsar, Debezium, etc., as well as banking domain knowledge.
Posted 1 week ago
3.0 - 7.0 years
6 - 10 Lacs
Pune
Work from Office
As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.
Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolve them as per the defined SLAs.
- Continuous Learning and Technology Integration: Be eager to learn new technologies and implement them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 4-12 years of experience required.
- ABAP on HANA application developers should possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA technical concepts and architecture, data modelling using HANA Studio, ABAP Development Tools (ADT), code performance rules and guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, database procedures, text search, ALV on HANA, and HANA Live model consumption.
- Designing and developing data dictionary objects (data elements, domains, structures, views, lock objects, and search helps) and formatting the output of SAP documents with multiple options.
- Modifying standard layout sets in SAP Scripts, Smart Forms, and Adobe Forms.
- Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, Forms).
Preferred technical and professional experience:
- Experience working on implementation, upgrade, maintenance, and post-production support projects would be an advantage.
- Understanding of SAP functional requirements and their conversion into technical design and development using the ABAP language for reports, interfaces, conversions, enhancements, and forms in implementation or support projects.
Posted 1 week ago
6.0 - 11.0 years
11 - 16 Lacs
Bengaluru
Work from Office
As a Senior Backend/Lead Development Engineer, you will be involved in developing automation solutions to provision and manage any infrastructure across your organization. As a developer, you will leverage the capabilities of Terraform and cloud offerings to drive infrastructure-as-code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team, and collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions. You will maintain high standards of software quality within the team by establishing good practices and habits, and focus on growing capabilities to support and enhance the experience of the offering.
Required education: Bachelor's Degree
Required technical and professional expertise:
* 15+ years of software development experience with z/OS or z/OS sub-systems.
* System programmer able to work on and support development/testing of IBM Z hardware I/O definitions - IODF and IOCDS generation and deployment.
* Familiar with HMC and HCD.
* 8+ years of professional experience developing with Golang, Python, and Ruby.
* 10+ years of hands-on experience with z/OS system programming or administration.
* Experience with Terraform key features such as infrastructure as code, change automation, and auto scaling.
* Experience working with a cloud provider such as AWS, Azure, or GCP, with a focus on scalability, resilience, and security.
* Cloud-native mindset and a solid understanding of DevOps principles in a cloud environment.
* Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging, and tracing for high reliability and performance.
* Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
* Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
* Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
* Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
* Strong analytical, debugging, and problem-solving skills to analyze issues and defects reported by customer-facing and test teams.
* Proficient in source control management tools (GitHub) and Agile lifecycle management tools.
* Soft skills: strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback.
Posted 1 week ago
2.0 - 5.0 years
6 - 11 Lacs
Bengaluru
Work from Office
HashiCorp, an IBM Company ("HashiCorp"), solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build products to give organizations a consistent way to manage their move to cloud-based IT infrastructures for running their applications. Our products enable companies large and small to mix and match AWS, Microsoft Azure, Google Cloud, and other clouds, as well as on-premises environments, easing their ability to deliver new applications. At HashiCorp, we have used the Tao of HashiCorp as our guiding principles for product development, and we operate according to a strong set of company principles for how we interact with each other. We value top-notch collaboration and communication skills, both among internal teams and in how we interact with our users.
The Role: As a Frontend Engineer II on the Boundary Transparent Session team at HashiCorp, you will be instrumental in expanding enterprise functionality that allows a VPN-like passive connection experience for customers. This role plays a critical part in ensuring the Boundary Desktop Client supports daily customer workflows in a performant, scalable way. You will be part of a full-stack team including backend and mobile engineers, and collaborate cross-functionally with Product, Design, and other partners.
Key Responsibilities:
- Develop and enhance frontend features that provide a VPN-like passive connection experience for customers.
- Ensure the Boundary Desktop Client supports daily customer workflows in a performant and scalable manner.
- Work closely with backend and mobile engineers as part of a full-stack team, and collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions.
- Maintain high standards of software quality within the team by establishing good practices and habits.
- Focus on growing capabilities to support an enhanced user experience.
Required education: Bachelor's Degree
Required technical and professional expertise:
- 4+ years of software engineering experience with a proven track record in technical or engineering lead roles.
- Experience with diverse technology stacks and project types is preferred.
- Proficiency in JavaScript is necessary. Experience with the Ember framework is preferred, or a strong interest in and ability to get up to speed with it.
- Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
- Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
- Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
Preferred technical and professional experience:
- Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
- Proven ability to lead by example, mentor junior engineers, and contribute to a positive team culture.
- Commitment to developing well-tested solutions to ensure high reliability and performance.
Posted 1 week ago
2.0 - 5.0 years
14 - 17 Lacs
Hyderabad
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, and provide regular support and guidance to project teams on complex coding, issue resolution, and execution.
Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working within an agile framework.
- Discover and implement the latest technology trends to build creative solutions.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing (an illustrative sketch follows this posting).
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms such as AWS, Azure, or GCP, including the use of cloud storage systems.
Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products and platforms, and customer-facing work.
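To illustrate the kind of PySpark work this posting describes, here is a minimal, hedged sketch of a small batch ETL step: reading a CSV, filtering, and aggregating with Spark's DataFrame API. The file paths, column names, and output location are illustrative assumptions only, not part of the posting.

```python
# Minimal PySpark batch-processing sketch (assumes a local Spark installation;
# the input path and column names are made-up examples).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV data with a header row and inferred schema.
orders = spark.read.csv("/tmp/orders.csv", header=True, inferSchema=True)

# Transform: keep completed orders and aggregate revenue per customer.
revenue_per_customer = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.count("*").alias("order_count"),
    )
)

# Load: write the result as Parquet for downstream analytics.
revenue_per_customer.write.mode("overwrite").parquet("/tmp/revenue_by_customer")

spark.stop()
```

The same DataFrame code scales from a laptop to a cluster because Spark distributes the filter and aggregation across executors; only the session configuration changes.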
Posted 1 week ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
- Advanced programming skills in Python, Scala, and Go: strong expertise in developing and maintaining microservices in Go (or other similar languages), with the ability to lead and mentor others in this area.
- Extensive exposure to developing big data applications, data engineering, ETL, and data analytics.
- Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of deploying and managing cloud-native applications.
- Leadership and collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members.
- Security and compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements.
- Analytical and problem-solving skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment.
Required education: Bachelor's Degree
Required technical and professional expertise:
- 4-7 years of experience, primarily using Apache Spark, Kafka, and SQL, preferably on data engineering projects with a strong TDD approach.
- Advanced programming skills in languages like Python, Java, and Scala, with proficiency in SQL.
- Extensive exposure to developing big data applications, data engineering, ETL tools, and data analytics.
- Exposure to data modelling, data quality, and data governance.
- Extensive experience creating and maintaining data pipelines - workflows that move data from various sources into data warehouses or data lakes.
- Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of developing, deploying, and managing cloud-native applications.
- Good to have: front-end development experience with React, Carbon, and Node for managing and improving user-facing portals.
- Leadership and collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members.
- Security and compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements.
- Analytical and problem-solving skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment.
Preferred technical and professional experience:
- Hands-on experience with data analysis and querying using SQL, and considerable exposure to ETL processes.
- Expertise in developing cloud applications with high-volume data processing.
- Experience building scalable microservices components using various API development frameworks.
Posted 1 week ago
7.0 - 12.0 years
15 - 27 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Altimetrik is hiring an Azure Data Engineer with good experience in Python, PySpark, SQL, Azure, and data modelling.
Location: Hyderabad, Bangalore, Chennai, Pune
Experience: 7 to 15 years
Notice period: Immediate to 1-week joiners
If you are interested, please share your profile at rmuppidi@altimetrik.com
Posted 1 week ago
8.0 - 13.0 years
10 - 19 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Summary: We are seeking a Senior Salesforce Developer with 8 years of hands-on experience in Salesforce platform development to join our offshore team in India. The ideal candidate will be responsible for the design, development, testing, and deployment of scalable and maintainable Salesforce solutions across Sales Cloud, Service Cloud, and custom applications. This role requires deep technical expertise in Apex, Lightning Web Components (LWC), and Salesforce integrations.
Key Responsibilities:
- Develop, customize, and maintain Salesforce applications using Apex, Visualforce, Lightning Web Components (LWC), and other Salesforce technologies.
- Participate in the full software development lifecycle, from requirements gathering to deployment and support.
- Build and maintain integrations between Salesforce and external systems using REST/SOAP APIs, middleware tools (e.g., MuleSoft), or custom connectors.
- Design and implement custom objects, workflows, process builders, flows, validation rules, and triggers.
- Optimize and refactor existing code to improve performance, security, and scalability.
- Collaborate with Salesforce administrators, architects, and business analysts to understand business needs and provide effective solutions.
- Develop and maintain unit tests and code coverage according to Salesforce best practices.
- Ensure compliance with governor limits, security standards, and data integrity policies.
Required Skills:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 8 years of Salesforce development experience, including Apex, Visualforce, and Lightning (Aura and LWC).
- Salesforce Platform Developer certifications are a plus.
- Proficient in Salesforce APIs (REST, SOAP, Bulk, Streaming) and integration patterns.
- Experience with Sales Cloud, Service Cloud, and custom app development.
- Strong understanding of data modeling, sharing rules, and security models in Salesforce.
- Proficient in JavaScript, HTML, CSS, and working knowledge of front-end frameworks.
- Experience with DevOps tools like Gearset, Copado, or Salesforce CLI for CI/CD.
- Familiarity with Salesforce DX, unlocked packages, and scratch orgs.
Posted 1 week ago
4.0 - 6.0 years
20 - 30 Lacs
Gurugram
Work from Office
Key Skills: Spark, Scala, Flink, Big Data, Structured Streaming, Data Architecture, Data Modeling, NoSQL, AWS, Azure, GCP, JVM tuning, Performance Optimization.
Roles & Responsibilities:
- Design and build robust data architectures for large-scale data processing.
- Develop and maintain data models and database designs.
- Work on stream processing engines like Spark Structured Streaming and Flink (see the streaming sketch below this posting).
- Perform analytical processing on big data using Spark.
- Administer, configure, monitor, and tune the performance of Spark workloads and distributed JVM-based systems.
- Lead and support cloud deployments across AWS, Azure, or Google Cloud Platform.
- Manage and deploy big data technologies such as business data lakes and NoSQL databases.
Experience Requirements:
- Extensive experience working with large data sets and big data technologies.
- 4-6 years of hands-on experience in the Spark/Big Data tech stack.
- At least 4 years of experience in Scala.
- At least 2 years of experience in cloud deployment (AWS, Azure, or GCP).
- Successfully completed at least 2 product deployments involving big data technologies.
Education: B.Tech/M.Tech (Dual), B.Tech.
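As a companion to the Structured Streaming item above, here is a minimal, hedged PySpark sketch of a streaming aggregation. It uses Spark's built-in `rate` source so it runs without external infrastructure; a production job would typically read from Kafka instead, and the window and watermark values are illustrative assumptions.

```python
# Minimal Spark Structured Streaming sketch using the built-in "rate" source
# (a synthetic stream of timestamped rows), so no Kafka cluster is required.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Source: the rate source emits columns `timestamp` and `value`.
events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

# Transform: count events per 10-second tumbling window, with a watermark
# so old state can be dropped.
counts = (
    events
    .withWatermark("timestamp", "30 seconds")
    .groupBy(F.window(F.col("timestamp"), "10 seconds"))
    .count()
)

# Sink: print each updated window count to the console.
query = (
    counts.writeStream
    .outputMode("update")
    .format("console")
    .option("truncate", "false")
    .start()
)

query.awaitTermination(30)  # run for roughly 30 seconds for this demo
spark.stop()
```

Swapping the `rate` source for a Kafka source and the console sink for a durable sink is the usual path from this sketch to a production streaming job.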
Posted 1 week ago
3.0 - 8.0 years
10 - 20 Lacs
Chennai
Work from Office
Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting, and strong communication skills.
Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT.
- Utilize the SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures (a small loading sketch follows this posting).
- Implement Snowflake task orchestration and schema modeling, and perform system performance tuning for large-scale data environments.
- Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions.
- Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub.
- Independently design and execute innovative ETL and reporting solutions that align with business and operational goals.
- Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps.
- Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance).
- Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies.
- Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows.
- Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations.
- Troubleshoot post-deployment production issues and deliver timely resolutions.
Experience Requirements:
- 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture.
- Hands-on experience with Snowflake's architecture, including SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization.
- Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control.
- Successfully built and deployed large-scale ELT pipelines using Snowflake and DBT, optimizing for performance and data quality.
- Proven track record of troubleshooting complex production data issues and resolving them with minimal downtime.
- Experience aligning data engineering practices with data governance and compliance standards.
- Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations.
- Strong ability to communicate technical details clearly across teams and stakeholders.
Education: Any Post Graduation, Any Graduation.
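For illustration only, here is a minimal, hedged sketch of a scripted batch load into Snowflake using the Snowflake Python connector, as an alternative to running the same SQL through the SnowSQL CLI that the posting mentions. The account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal sketch: bulk-load staged CSV files into a Snowflake table and check
# the row count. Account, credentials, stage, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",        # hypothetical service user
    password="***",         # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO performs the bulk load from an internal/external stage.
    cur.execute("""
        COPY INTO raw_orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Quick sanity check after the load.
    cur.execute("SELECT COUNT(*) FROM raw_orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```

In a Snowpipe-based design the same COPY statement is attached to a pipe so loads trigger automatically on file arrival; the scripted version above is the batch equivalent.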
Posted 1 week ago
8.0 - 12.0 years
19 - 22 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: Remote; Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Type: Contract (8-12 months)
Experience: 8-12 years
Role Overview: We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.
Responsibilities:
Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.
MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.
ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.
Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.
Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.
Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.
Qualifications:
Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.
Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Gurugram
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences to drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.
As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs.
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.
You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in data engineering, focusing on building and maintaining data environments.
- Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes and managing data solutions within an SLA-driven environment.
- Exhibit a strong background in developing data products and APIs and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Have familiarity with columnar OLAP databases and data modeling.
- Have experience building ELT/ETL processes using tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau (a minimal orchestration sketch follows this posting).
Added bonus if you also have:
- A good understanding of Salesforce and NetSuite systems.
- Experience in SaaS environments.
- Designed and deployed ML models.
- Experience with events and streaming data.
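As a small companion to the ELT orchestration experience mentioned above, here is a minimal, hedged Apache Airflow sketch (assuming Airflow 2.4 or later) of a daily pipeline with an extract task followed by a transform task. The DAG id, schedule, and task bodies are illustrative placeholders rather than a real pipeline.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> transform pipeline.
# The DAG id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system or third-party API.
    print("extracting raw data")


def transform():
    # Placeholder: clean and model the extracted data (e.g., trigger dbt).
    print("transforming data")


with DAG(
    dag_id="daily_elt_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run transform only after extract succeeds
```

In practice the placeholder callables would be replaced by operators for the warehouse and transformation tool in use, with the same dependency structure defining the run order.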
Posted 1 week ago
6.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Work from Office
As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team!
What You'll Tackle Each Day:
- End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support.
- Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau.
- SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis. PySpark experience is an added advantage.
- Ownership: Independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities.
- Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions.
- Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense.
What You'll Bring to the Table:
- 6 to 10 years of experience in Business Intelligence, focusing on Tableau development and SQL, consistently delivering impactful BI solutions.
- A strong understanding of data visualization best practices and BI architecture, ensuring that your dashboards and reports are both visually compelling and highly functional.
- Excitement to work on Gen AI innovations alongside the day-to-day project work.
- Experience implementing BI solutions for complex business scenarios, solving real-world challenges through data-driven insights.
- Unique expertise and problem-solving abilities that make you stand out as a key team member.
- A go-getter attitude: you approach every project with enthusiasm and drive, ensuring high-quality and timely delivery, even in fast-paced environments.
- A mentor and collaborator who paves the way for others and contributes to a positive, innovative team dynamic.
- An innovative mindset that finds creative ways to leverage data, consistently turning insights into actionable strategies.
- Not just a hard worker but a fun one: you help foster an enjoyable work environment where collaboration and innovation thrive, making the journey as rewarding as the outcome.
What do you get in return? Thrive & Grow with Us:
- Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
- Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
- Idea Tanks: Innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
- Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best, whether it's over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
- Snack Zone: Stay fuelled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
- Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
- Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
Posted 1 week ago
5.0 - 8.0 years
6 - 10 Lacs
Gurugram
Remote
Key Responsibilities:
- Design, develop, and maintain data pipelines to support business intelligence and analytics.
- Implement ETL processes using SSIS (advanced level) to ensure efficient data transformation and movement.
- Develop and optimize data models for reporting and analytics.
- Work with Tableau (advanced level) to create insightful dashboards and visualizations.
- Write and execute complex SQL (advanced level) queries for data extraction, validation, and transformation.
- Collaborate with cross-functional teams in an Agile environment to deliver high-quality data solutions.
- Ensure data integrity, security, and compliance with best practices.
- Troubleshoot and optimize data workflows for performance improvement.
Required Skills & Qualifications:
- 5+ years of experience as a Data Engineer.
- Advanced proficiency in SSIS, Tableau, and SQL.
- Strong understanding of ETL processes and data pipeline development.
- Experience with data modeling for analytical and reporting solutions.
- Hands-on experience working in Agile development environments.
- Excellent problem-solving and troubleshooting skills.
- Ability to work independently in a remote setup.
- Strong communication and collaboration skills.
Posted 1 week ago
6.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
We are looking for a skilled Senior Oracle Data Engineer with 6-10 years of experience in the IT services and consulting industry to join our team at Apps Associates (I) Pvt. Ltd.
Roles and Responsibility:
- Design, develop, and implement data engineering solutions using Oracle technologies.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain large-scale data pipelines and architectures.
- Ensure data quality, integrity, and security through data validation and testing procedures.
- Optimize data processing workflows for improved performance and efficiency.
- Troubleshoot and resolve complex technical issues related to data engineering projects.
Job Requirements:
- Strong knowledge of Oracle data engineering concepts and technologies.
- Experience with data modeling, design, and development.
- Proficiency in programming languages such as Java or Python.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
Posted 1 week ago
0.0 - 1.0 years
1 - 3 Lacs
Bengaluru
Work from Office
Responsibilities:
* Design, develop, and maintain integrations using Python, JavaScript, Groovy, C++, and Java.
* Collaborate with cross-functional teams on data modeling, XML/JSON/CSV parsing, and SQL queries (a small parsing sketch follows this posting).
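To illustrate the parsing-and-SQL portion of this posting, here is a minimal, hedged Python sketch using only the standard library: it parses a small CSV string and a JSON snippet, then loads the rows into an in-memory SQLite table and queries them. The sample records, field names, and table name are made up for illustration.

```python
# Minimal standard-library sketch: parse CSV and JSON, then query via SQLite.
# The sample records, field names, and table name are illustrative only.
import csv
import io
import json
import sqlite3

csv_text = "id,name\n1,Asha\n2,Ravi\n"
json_text = '[{"id": 3, "name": "Meena"}]'

# Parse both formats into a common (id, name) tuple shape.
rows = [(int(r["id"]), r["name"]) for r in csv.DictReader(io.StringIO(csv_text))]
rows += [(item["id"], item["name"]) for item in json.loads(json_text)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO people (id, name) VALUES (?, ?)", rows)

# Simple SQL query over the combined data.
for row in conn.execute("SELECT id, name FROM people ORDER BY id"):
    print(row)
conn.close()
```

A real integration would read the CSV/JSON/XML from files or APIs and write to a production database, but the parse-normalize-load-query flow is the same.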
Posted 1 week ago