4.0 - 7.0 years
6 - 9 Lacs
Gurugram
Work from Office
"Strong experience with machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn. Proficiency in programming languages such as Python, Pyspark Experience with cloud platforms such as AWS, Azure, Google Cloud, or Onpremise Familiarity with containerization technologies like Docker and Kubernetes. Experience with workflow orchestration tools such as Kubeflow, Airflow, and Argo. Knowledge of CI/CD tools such as Jenkins, GitLab, or CircleCI. Experience with model interpretability tools such as LIME and SHAP." .
Posted 2 weeks ago
6.0 - 12.0 years
8 - 14 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
If you're someone with energy, drive, and a passion for innovation, you will be surrounded by like minds at AST. We are Oracle Platinum Partners who strive every day to be the best at what we do. Job Summary: We are seeking a strong technical consultant with 6 to 12 years of experience in OIC (Oracle Integration Cloud) implementation projects and proven experience in designing solutions. Skills required: Technical delivery in OIC involving Oracle SaaS (Oracle ERP Cloud, Oracle HCM Cloud, or Oracle SCM Cloud), Oracle EBS, and Oracle Cloud Infrastructure. Hands-on experience in Fusion Integration technologies, OIC, and BI Publisher reports. Strong knowledge of Oracle PaaS technologies like OIC, PCS, and VBCS. Expertise in end-to-end integration development on Cloud and on-premises. Hands-on experience in Fusion Reporting technologies like BI Publisher and OTBI. Working knowledge of Oracle Cloud Conversions and BPM Approvals will be preferred. Should be well-versed with the basics of OCI. Should have working knowledge of Oracle Cloud Functional processes to be able to design solutions independently. Qualifications: 6+ years of experience as an Oracle Integration Consultant. Minimum 2 years of experience in Oracle Cloud Technologies. Experience working with global teams and teams across locations. Project documentation, including technical design documents and testing scripts. A certification relevant to Oracle Technologies will be an added advantage. Good communication skills. Statement of Non-Discrimination: We value global diversity and are committed to building a diverse and inclusive workplace where we learn from each other. AST is proud to be an equal opportunity employer, making all employment decisions, including recruiting, hiring, training, and promoting, without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, veteran status or any other characteristic or classification protected by law. #LI-DNI
Posted 2 weeks ago
6.0 - 8.0 years
8 - 10 Lacs
Noida, Indore
Work from Office
6-8 years of good hands-on exposure to Big Data technologies - PySpark (DataFrame and SparkSQL), Hadoop, and Hive. Good hands-on experience with Python and Bash scripts. Good understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis and research skills. Demonstrable ability to think outside the box and not be dependent on readily available tools. Excellent communication, presentation and interpersonal skills are a must. Hands-on experience with cloud-platform Big Data technologies (e.g. IAM, Glue, EMR, Redshift, S3, Kinesis). Orchestration with Airflow and experience with any job scheduler. Experience in migrating workloads from on-premise to cloud and in cloud-to-cloud migrations. Good to have: Develop efficient ETL pipelines as per business requirements, following the development standards and best practices. Perform integration testing of the created pipelines in the AWS environment. Provide estimates for development, testing & deployments on different environments. Participate in code peer reviews to ensure our applications comply with best practices. Create cost-effective AWS pipelines with the required AWS services, e.g. S3, IAM, Glue, EMR, Redshift, etc.
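As a brief illustration of the PySpark DataFrame and SparkSQL exposure described above, here is a minimal sketch; the S3 paths, column names and aggregation are placeholders, not details from the posting:

```python
# Minimal sketch of a DataFrame transformation and its SparkSQL equivalent.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw data into a DataFrame.
orders = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("s3://example-bucket/raw/orders/"))

# DataFrame API: aggregate order amounts by day.
daily_totals = (orders
                .withColumn("order_date", F.to_date("order_ts"))
                .groupBy("order_date")
                .agg(F.sum("amount").alias("total_amount")))

# The same logic expressed through SparkSQL on a temp view.
orders.createOrReplaceTempView("orders")
daily_totals_sql = spark.sql("""
    SELECT to_date(order_ts) AS order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY to_date(order_ts)
""")

# Write the curated result back to the data lake.
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
```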
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, Navi Mumbai
Work from Office
Job Role: Oracle Field Service Cloud Support Engineer. Location: India (Offshore). Description: We are looking for a techno-functional person who has real-time, hands-on functional and technical experience in Oracle Field Service Cloud and/or has worked at L2/L3 level support. Responsibilities include providing customer service on a functional level and ultimately driving complete and total resolution of each issue reported by the customer. Qualifications, Education and Experience: BE, BTech, MCA or equivalent preferred; other qualifications with adequate experience may be considered. 3 to 6 years of relevant working experience. Functional/Technical Skills: Must have a good understanding of Field Service Cloud version 24C (or latest) capabilities. Implementation/support experience on Field Service Cloud modules/features like Capacity, Collaboration, Core Manage, Customer Communication, Forecasting, Mobility, Routing, Smart Location, etc. Implementation/support experience with Field Service Cloud technologies like Java, C#, .NET, HTML5, PHP and CSS. Ability to relate the product functionality to business processes and thus offer implementation advice to customers on how to meet their various business scenarios using Commerce Cloud. Technically good skills in Knockout.js, JavaScript, HTML, CSS, Node.js, REST APIs, JSON and Postman.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Associate Lead - Kubernetes Platform. Is your passion for Cloud Native Platforms? That is, envisioning and building the core services that underpin all Thomson Reuters products? Then we want you on our India-based team! This role is in the Platform Engineering organization, where we build the foundational services that power Thomson Reuters products. We focus on the subset of capabilities that help Thomson Reuters deliver digital products to our customers. Our mission is to build a durable competitive advantage for TR by providing building blocks that get value-to-market faster. About the Role: This role is within Platform Engineering's Service Mesh team, a dedicated group which engineers and operates our Service Mesh capability, a microservice platform based on Kubernetes and Istio. You will primarily work with the AWS and Azure public clouds, especially Kubernetes (AWS EKS and Azure AKS), Service Mesh technology like Istio, Terraform, Datadog, PagerDuty, and Python, Golang, Java and/or .Net Core. Programming: Golang; Other: Java, C#. Primary Skill: Golang, Kubernetes. Work closely with an architect to establish and entrench the architectural design & principles for Service Mesh. Participate in all aspects of the development lifecycle: Ideation, Design, Build, Test and Operate. We embrace a DevOps culture ("you build it, you run it"); while we have dedicated 24x7 level-1 support engineers, you may be called on to assist with level-2 support. About You: 6+ years software development experience. 2+ years of experience building cloud native infrastructure, applications and services on AWS, Azure or GCP. Hands-on experience with Kubernetes, ideally AWS EKS and/or Azure AKS. Experience with Istio or other Service Mesh technologies. Experience with container security and supply chain security. Experience with declarative infrastructure-as-code, CI/CD automation and GitOps. Experience with Kubernetes operators written in Golang. A bachelor's degree in Computer Science, Computer Engineering or similar. #LI-AD2 What's in it for you? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more.
We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here . Learn more on how to protect yourself from fraudulent job postings here . More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Join us for an exciting opportunity to lead and innovate in data engineering, enhancing your career in a dynamic environment. Job Summary: As a Data Engineering VP at JPMorgan Chase within the strategic PNA Data platform, you will be responsible for leading the technical delivery. Your role will involve designing and implementing a cloud-native data platform to support Finance Planning and Analysis across more than 50 markets. You will also collaborate with cross-functional teams to build a robust data platform and manage the development infrastructure of the platform. Job Responsibilities: Lead the design and implementation of a cloud-native PNA Data platform. Develop and maintain robust ETL processes for data integration. Collaborate with various teams to support data platform capabilities. Manage platform development infrastructure and support users. Design strategies for effective data lake utilization. Leverage ETL tools and DevOps practices for platform engineering. Ensure high-quality software delivery and platform stability. Required Qualifications, Capabilities, and Skills: Formal training or certification in software engineering concepts with 5+ years of applied experience. Strong hands-on experience in Python, SQL, and database development. Proficiency in cloud platforms (AWS) and containerization technologies like Docker and Kubernetes. Solid understanding of data structures, caching, multithreading, and asynchronous communication. Strong collaboration skills and experience in leading teams. Preferred Qualifications, Capabilities, and Skills: Experience with Databricks and data orchestrator tools like Airflow. Familiarity with data governance and metadata management. Exposure to NoSQL databases like MongoDB. Experience with messaging technologies like Kafka and Kinesis.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Our team works together to build a wide variety of web experiences that power, extend, and showcase Visa's Cloud Platform, such as provisioning and managing CloudView Environments for Containers. As a part of this team, you will lead a team and implement efficient, lean, usable, and developer-friendly systems. You will develop into a Subject Matter Expert (SME) in UI/UX. You will also work in a rapidly iterative environment where we prototype new user interfaces and ship new code frequently. You will work to standardize the User Experience across the Visa Cloud Platform. Job responsibilities: Lead a UI team to ensure design and quality are meeting requirements. Collaborate with the CloudView UI Lead to implement work streams with quality. Prototype UIs and iterate on design. Build and maintain UI automated tests. Develop performant and lean code. Introduce AI and auto-suggest tools in user flows. Work independently, driving projects end to end, and provide guidance to junior developers. Create documentation and procedures for development, deployment and maintenance. When required, should be able to set up and debug backend API (Java) code. This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager. Basic Qualifications: 7 or more years of relevant work experience with a Bachelor's Degree, or at least 5 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD), or 3 years of work experience with a PhD. Preferred Qualifications: 7 or more years of work experience with a Bachelor's Degree, or 5 or more years of relevant experience with an Advanced Degree (e.g. Masters, MBA, JD, MD), or up to 3 years of relevant experience with a PhD. Experienced with User Experience (UX) and a Subject Matter Expert (SME) in UI. Work independently with minimal supervision, driving projects end to end and providing guidance to senior developers. Strong communication skills; able to collaborate and talk through your architecture and delivery decisions. Experience building REST-based web applications using AngularJS. Good to have knowledge of API Gateway (NGINX). Good to have experience building software/frameworks for infrastructure automation/PaaS/Continuous Delivery. Good to have knowledge of Cloud technologies, Docker, Kubernetes, Istio, GitOps, Jenkins.
Posted 2 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Noida, Indore
Work from Office
5-7 years of good hands-on exposure to Big Data technologies - PySpark (DataFrame and SparkSQL), Hadoop, and Hive. Good hands-on experience with Python and Bash scripts. Good understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis and research skills. Demonstrable ability to think outside the box and not be dependent on readily available tools. Excellent communication, presentation and interpersonal skills are a must. Good to have: Hands-on experience with cloud-platform Big Data technologies (e.g. IAM, Glue, EMR, Redshift, S3, Kinesis). Orchestration with Airflow and experience with any job scheduler. Experience in migrating workloads from on-premise to cloud and in cloud-to-cloud migrations. Develop efficient ETL pipelines as per business requirements, following the development standards and best practices. Perform integration testing of the created pipelines in the AWS environment. Provide estimates for development, testing & deployments on different environments. Participate in code peer reviews to ensure our applications comply with best practices. Create cost-effective AWS pipelines with the required AWS services, e.g. S3, IAM, Glue, EMR, Redshift, etc.
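As an illustration of the Airflow orchestration experience listed above, here is a minimal sketch of a daily DAG; the DAG id, schedule, and callable are placeholders, and the Airflow 2.x API is assumed:

```python
# Minimal sketch: a daily Airflow DAG wrapping one ETL step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_load(**context):
    # In a real pipeline this would trigger a pySpark / Glue / EMR job.
    print(f"Loading data for {context['ds']}")


with DAG(
    dag_id="daily_orders_load",          # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    load_task = PythonOperator(
        task_id="load_orders",
        python_callable=run_daily_load,
    )
```

In practice the PythonOperator would usually be replaced by an EMR or Glue operator, with retries and alerting configured on the task.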
Posted 2 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
We help the world run better. Job Title: Engineer - Business Data Cloud - Data Product Runtime Team. Job Description: As an Engineer in the Data Product Runtime Team, you will be a crucial part of the team expansion in Bangalore, contributing to SAP Business Data Cloud initiatives. This role offers an opportunity to learn from experienced engineers and develop skills in Spark optimization, scalable data processing, and data transformation pipelines as part of SAP's Data & AI strategy. Responsibilities: Support the implementation and evolution of a scalable data processing framework running on Spark. Participate in the creation and optimization of pluggable data transformation pipelines. Optimize CI/CD workflows using GitOps practices. Apply SQL skills in support of data transformation initiatives. Learn to incorporate AI & ML technologies into engineering workflows. Engage with SAP HANA Spark in data processing tasks. Collaborate with global colleagues for effective project contribution. Qualifications: Basic experience in data engineering and distributed data processing. Proficiency in Python (PySpark) is essential; knowledge of Scala and Java is beneficial. Familiarity with Spark optimization and scalable data processing. Foundation in developing data transformation pipelines. Experience with Kubernetes, GitOps, and modern cloud stacks. Interest in AI & ML technologies and industry trends. Good communication skills for effective collaboration in a global team. Eagerness to learn about SAP Data Processing solutions and platform initiatives. Bring out your best. Successful candidates might be required to undergo a background verification with an external vendor. Requisition ID: 426942 | Posted Date: May 27, 2025 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
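For the "pluggable data transformation pipelines" mentioned in this posting, one common pattern is sketched below under the assumption of a plain PySpark environment; the step functions and column names are illustrative and not SAP-specific:

```python
# Minimal sketch: a "pluggable" pipeline where each step is a plain
# DataFrame -> DataFrame function and the pipeline is an ordered list of steps.
from functools import reduce

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()


def drop_nulls(df: DataFrame) -> DataFrame:
    return df.na.drop()


def add_load_date(df: DataFrame) -> DataFrame:
    return df.withColumn("load_date", F.current_date())


def run_pipeline(df: DataFrame, steps) -> DataFrame:
    # Apply each pluggable step in order.
    return reduce(lambda acc, step: step(acc), steps, df)


raw = spark.createDataFrame([(1, "a"), (2, None)], ["id", "value"])
result = run_pipeline(raw, [drop_nulls, add_load_date])
result.show()
```

Keeping each step as an independent function makes individual transformations easy to unit-test and to swap in or out per dataset.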
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai
Work from Office
We help the world run better. Responsible for accelerating S/4HANA Public Cloud mindshare within India; also responsible for a solution, nurture topic, strategic sales program or designated segment determined to require a period of extraordinary attention. The scope of activities spans from thought leadership, creating awareness in the market, best practice development, enabling field sales / value adding teams / partner organizations, creating accretive pipeline and engaging in existing opportunities with their respective solution/topic of focus. A customer-facing role which acts as an accelerator to existing sales teams and customer opportunities, and identifies new business through market or partner engagement. It is an important leverage point for enablement, adoption, best practice development and identification of target customers & campaigns for engagement by the field. Core tasks include: Responsible for the overall success of the S/4 Public Cloud business in liaison with all functions and leadership within the Market Unit. Maintain overall health of business through Pipeline/Demand, Partner Eco-System, and Enablement across Sales, Presales & Partners. Participate in regular forecast meetings at MU and Regional level; responsible for Volume (NN) and Value Forecast management for the Market Unit. Help, guide & drive other ASEs within the MU as a team leader. Define target account lists via the creation of ideal customer profiles. Bring focus and accelerated enablement to sales teams (Sales, Solution Sales, value adding teams, partner organizations) to ensure high field adoption. Educate sales teams on curated positioning & value proposition for the specified focus solution/area and consult with customers accordingly. Create accretive pipeline via direct engagement, through the Sales field, through the ecosystem and via demand management. Inform the global solution areas of new requirements, successful campaigns or segmentations, and innovations with customers. Engage with the field on deal strategy - and throughout the full opportunity lifecycle, where needed. Consult with the field on deal execution with regard to focus-specific contracting & deal structure. Consult on and assess the forecast of the focus area. Build strong sales best practices to incubate a repeatable, structured approach for new businesses. Accountability: Responsible for driving the overall business outcome of S/4HANA Public Cloud in the Market Unit; responsible for delivery of the outcome of assigned projects or areas of responsibility; internally recognized senior on complex technical and business matters; works on large, complex activities, using demonstrated creativity and expertise and applying specialist professional knowledge to deliver high quality results / technical solutions; collaborates in devising long-term concepts; may include team lead or supervisory responsibilities.
Complexity: Contributes independently, resolves complex issues in own specialist area (e.g. cross-functional or cross-country projects); works independently on topics while setting priorities, having sole responsibility; provides regular project status and updates; decisions/solutions can enhance essentially current and future design and strategy, and enhance complex systems & processes. Experience: Advanced technical or business skills and special knowledge in one or several areas; individuals with a customer focus have developed the acumen to cultivate and develop lasting customer relations; typically several years of experience with an increasing amount of responsibility. Communication: Builds and maintains partnerships with internal and external customers and partners; contributes actively to build common ground for cooperation; communicates clearly, conveying processes & policies in a way that others can understand; communicates relevant messages in a timely manner and with constructive feedback to cross-functional colleagues & managers. Experience & Education Requirements: Sound understanding of Cloud ERP markets and the related solution area, including competitive products and the SAP suite of products. Prior experience with software/IT organizations and with a demonstrated proficiency in Enterprise and/or LoB software solutions through Solution Management, Sales, Presales, Consulting or Business Development roles. SAP product experience and/or SAP sales experience. Working knowledge of cloud, Hosted Services, SaaS/PaaS models and cloud-based commerce/business networks. Capable of leveraging a professional network resulting in market, pipeline and revenue growth for SAP. Proven track record of success in the selected industry area. Customer-facing experience. Fluency in English; any other language an asset. Fluency in the language of local markets desirable. Education: Bachelor's degree (or equivalent) required; MBA or equivalent degree from an accredited university preferred. Completion of Sales Methodology training desirable.
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Pune
Work from Office
This is a Java AWS stack development profile requiring 4-6 years of experience, along with hands-on exposure to DevOps tools. Must have experience in developing applications using AWS services (EKS, RDS, EC2, S3, SQS, EMR, etc.). Must have experience in developing applications using the Java stack. Must have experience in developing applications using Kubernetes and Docker. Must have experience with GitOps CI/CD tools like GitHub Actions, etc. Must have experience in Spring Boot and Spring Security - standards, practices, and expertise. Must have monitoring/alerting experience using Datadog and Grafana. Experience with Terraform + Helm or other cloud-agnostic tools. Good to have experience developing with Python, Groovy, Bash or other common languages used in AWS.
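The posting's primary stack is Java, but for the "good to have" scripting point above, here is a minimal Python/boto3 sketch touching two of the AWS services listed (S3 and SQS); the bucket, queue URL and region are placeholders:

```python
# Minimal sketch: upload an object to S3, then notify an SQS queue.
import json

import boto3

s3 = boto3.client("s3", region_name="ap-south-1")
sqs = boto3.client("sqs", region_name="ap-south-1")

# Upload a small JSON object (bucket and key are placeholders).
s3.put_object(
    Bucket="example-bucket",
    Key="incoming/report.json",
    Body=json.dumps({"status": "ready"}),
)

# Tell downstream consumers the object is ready for processing.
sqs.send_message(
    QueueUrl="https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue",
    MessageBody=json.dumps({"key": "incoming/report.json"}),
)
```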
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We help the world run better. Job Title: Senior Engineer - Business Data Cloud - Data Product Runtime Team. Job Description: As a Senior Engineer in the Data Product Runtime Team, you will be an essential asset to the expansion of our Foundation Services team in Bangalore. Your expertise in Spark optimization, scalable data processing, and data transformation pipelines will drive SAP Business Data Cloud initiatives, supporting SAP's Data & AI strategy. Responsibilities: Design, implement and evolve capabilities of a highly scalable data processing framework running on Spark. Implement and refine pluggable data transformation pipelines for enhanced data processing efficiency. Enhance CI/CD workflows through GitOps methodologies for greater operational efficiency. Apply SQL skills to execute and optimize data transformations in large-scale projects. Integrate AI & ML technologies into engineering processes, adhering to industry best practices. Utilize SAP HANA Spark for effective data processing solutions. Communicate effectively with global colleagues to ensure successful project delivery. Qualifications: Solid experience in data engineering, distributed data processing, and SAP HANA or related databases. Proficiency in Python (PySpark) is required; Scala and Java are beneficial. Experience in Spark optimization and scalable data processing. Expertise in developing data transformation pipelines. Familiarity with Kubernetes, GitOps, and modern cloud stacks. Understanding of AI & ML technologies and industry trends. Excellent communication skills for teamwork in a global context. Background in SAP Data Processing solutions is a plus. Bring out your best. Successful candidates might be required to undergo a background verification with an external vendor. Requisition ID: 426935 | Posted Date: May 27, 2025 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
Posted 2 weeks ago
9.0 - 13.0 years
35 - 40 Lacs
Bengaluru
Work from Office
We help the world run better. Job Title: Engineering Expert - Business Data Cloud - Data Product Runtime Team. Job Description: As an Engineering Expert, you will be instrumental in the expansion of our Foundation Services team in Bangalore. Your profound expertise in distributed data processing, Spark optimization, and data transformation pipelines will drive scalable data solutions, reinforcing SAP Business Data Cloud's pivotal role in SAP's Data & AI strategy. Responsibilities: Lead the design and execution of optimized data processing solutions. Spearhead Spark optimization strategies to maximize performance and scalability in data processing. Architect and refine pluggable data transformation pipelines for efficient processing of large datasets. Implement GitOps practices to advance CI/CD pipelines and operational efficiency. Apply advanced SQL skills to refine data transformations across vast datasets. Integrate AI & ML technologies into high-performance data solutions, staying informed on emerging trends. Utilize SAP HANA Spark to drive innovation in data engineering processes. Mentor junior engineers, fostering a culture of continuous improvement and technical excellence. Collaborate effectively with global stakeholders to achieve successful project outcomes. Qualifications: Extensive experience in data engineering, distributed data processing, and expertise in SAP HANA or similar databases. Proficiency in Python (PySpark) is essential; knowledge of Scala and Java is advantageous. Advanced understanding of Spark optimization and scalable data processing techniques. Proven experience in architecting data transformation pipelines. Knowledge of Kubernetes, GitOps, and modern cloud stacks. Strong understanding of AI & ML technologies and industry trends. Effective communication skills within a global, multi-cultural environment. Proven track record of leadership in data processing and platform initiatives.
Posted 2 weeks ago
2.0 - 5.0 years
1 - 6 Lacs
Pune
Hybrid
So, what’s the role all about? Serve as one of the top-performing and proficient engineers in designing and developing high-quality software that meets specified functional and non-functional requirements within the time and resource constraints given. How will you make an impact? Has strong capabilities in his/her component level area of ownership Analyses information, solves problems and reaches conclusions within his/her professional space Responsible for delivering feature(s) for enterprise-grade software independently Acts as a point of escalation for low-level problems within NICE and some external interfaces Mentor other junior team members Interface with R&D Groups (including Product Managers), Sales, Customer Support and Services Conducts timely Code reviews for delivery Have you got what it takes? Bachelor/Master of Engineering Degree in Computer Science, Electronic Engineering or equivalent from reputed institute 2-4 years of software development experience Working experience in Core Java, proficient with Java algorithms and data structures Worked in high performance, highly available and scalable systems. Strong experience with Angular 12+ Strong Development experience creating RESTful Web APIs Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP etc Experience working in and driving Continuous Integration and Delivery practices using industry standard tools such as Jenkins. Ability to work independently and collaboratively, good communication skill. Able to resolve problems of moderate scope which requires an analysis based on a review of a variety of factors. What’s in it for you? Join an ever-growing, market disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7284 Reporting into: Tech Manager, Cloud Operations Role Type: Individual Contributor
Posted 2 weeks ago
10.0 - 20.0 years
20 - 35 Lacs
Noida
Work from Office
Experience: 10+ years. Job Profile: Analyze business requirements and translate them into technical architecture. Document architecture decisions, diagrams, and specifications. Ensure architectural compliance with enterprise standards and frameworks. Technical Skills: Languages: Java, Node or Golang. Cloud: AWS or Azure preferred; both will be an advantage (stacks of both: Azure Service Bus, SQL database, Redis (Azure Cache), Blob storage, API Management services, Key Vault, AWS IoT Core, DynamoDB, Lambda, Elasticsearch, Glue, Athena, EKS, SQS, SNS, AWS Kinesis Stream, Kinesis Firehose, API Gateway, CloudFront, Cognito). Gen AI tools: AWS Lex, Bedrock, Vector DB. Code-optimization Gen AI tools: GitHub Copilot, Amazon Q, Cluster AI. Cloud build tools: Cloud Foundry, Terraform, CloudFormation or any others. UI or app frameworks: Flutter, React.js, React Native. NoSQL & relational DBs: Neo4j, Cassandra, InfluxDB, DynamoDB, MongoDB, Cosmos DB, Oracle, MySQL, MS SQL Server. Frameworks: any messaging solution like Kafka or Kinesis Data Streams, Docker, Kubernetes, Spring Framework. Experience in analytical solutions. Key Responsibility Areas: Understanding Business Needs: gather and analyze business requirements, identifying opportunities and challenges. Technical Design and Implementation: design and implement technical solutions, selecting appropriate technologies and ensuring compliance with non-functional requirements like performance, scalability, and security. Stakeholder Management: Solution Architects communicate with various stakeholders, including business users, technical teams, and management, to ensure alignment and buy-in. Documentation and Best Practices: responsible for documenting the solution architecture, creating technical documentation, and promoting best practices within the organization. Risk Management: Solution Architects identify and mitigate potential risks related to the chosen solution and its implementation. Collaboration: collaborate with other architects, developers, and project managers throughout the project lifecycle.
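As a rough sketch of how two of the AWS services named above (Kinesis and DynamoDB) often fit together, assuming a Python Lambda runtime; the table name and record fields are placeholders:

```python
# Minimal sketch: an AWS Lambda handler consuming a Kinesis data stream
# and persisting the decoded records to DynamoDB.
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-device-events")   # placeholder table name


def handler(event, context):
    # Kinesis delivers records base64-encoded inside the Lambda event payload.
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item={
            "device_id": payload["device_id"],
            "event_ts": payload["timestamp"],
            "reading": payload.get("reading"),
        })
    return {"processed": len(event["Records"])}
```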
Posted 2 weeks ago
3.0 - 8.0 years
3 - 8 Lacs
Prayagraj
Work from Office
Support for Juniper's M/T/MX/PTX/ACX series products. Supporting critical network infrastructures of Enterprise/Telecom/Cloud customers. Knowledge of IP packet flow and OSI layers. Layer 3: IP & related technologies, IP routing protocols. Layer 2 technologies.
Posted 2 weeks ago
12.0 - 17.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Your Impact: You will lead the engineering team to build new cloud-delivered security services that help protect an organization's web traffic. You will be responsible for architecting and designing solutions for complex features in the product. Lead the team during complex production issues and help deliver quick resolution. Partner with cross-functional core stakeholders across engineering, product management and field teams to lead the technology direction and facilitate key technical decision points. Collaborate with multiple engineering groups within Cisco to deliver end-to-end solutions. Work closely with other Product leads to drive product and technology strategy. Create a culture of execution excellence coupled with innovation. Build a team of talented and passionate leaders as we scale. Minimum Qualifications: 12+ years of software development experience with a strong track record in technology strategy and product delivery leadership. 5+ years of proven experience in system design, development, and delivery of highly scalable SaaS products (AWS preferred). Strong technical expertise in C programming; good to have experience with Golang and Python. Strong grasp of networking fundamentals (L3/L4, TCP/IP, HTTP/HTTPS, Proxy, TLS). Proven experience with containerization technologies like Docker and Kubernetes. Solid understanding of Cloud technologies: architecture, design patterns, productionization, monitoring, scaling, and operations. Domain knowledge in Security (encryption, firewalls, secure systems design, vulnerability management, secure software design, credential management, etc.) is valued. Preferred Qualifications: Hands-on working experience with data path technologies like VPP/DPDK. Experience in building cloud-delivered, SaaS-based security solutions. Good knowledge of emerging cyber threats and threat intelligence techniques to detect and remediate threats. Master's degree in Computer Science or Electrical Engineering preferred.
Posted 2 weeks ago
7.0 - 9.0 years
22 - 30 Lacs
Bengaluru
Work from Office
Technical Lead: Job Summary: We are seeking a highly skilled and experienced Salesforce Technical Lead to spearhead the development and support of applications built on Salesforce.com, specifically focusing on Salesforce Service Cloud, Experience Cloud, Identity, and Core cloud. The ideal candidate will possess deep technical expertise in Salesforce development, including LWC, Aura, LWR, Apex, and related technologies, and will lead a team of developers in delivering high-quality, scalable solutions. This role requires strong leadership, excellent communication skills, and a commitment to adhering to best practices and Agile/SDLC processes. Responsibilities: Technical Leadership and Architecture: Provide technical leadership and guidance to the development team. Do hands-on implementation of features as per project priorities. Have at least 7-9 years of experience handling Salesforce Community/Experience, Identity and Core clouds. Design and architect scalable and robust Salesforce solutions, particularly within Salesforce Service Cloud, Experience Cloud and Identity. Ensure adherence to Salesforce best practices, coding standards, and governor limits. Oversee the implementation of complex Apex code, LWC, LWR, Aura components, and integrations. Development and Implementation: Lead the development and implementation of custom Salesforce solutions, including Experience Cloud portals, community LWC components, LWR and CMS integrations. Hands-on experience in multi-region and global rollouts across 30+ markets. Develop and maintain high-quality code, ensuring code reviews are conducted and standards are met. Utilize SFDX Scratch Orgs for development, adhering to Git, source-driven development, and trunk-based development principles. Ensure features are usable in both mobile and desktop environments. Code Review and Quality Assurance: Conduct thorough code reviews to ensure code quality, performance, and security. Enforce coding standards and best practices. Comprehensive documentation of technical design and deployment documents. Release Management and Support: Support releases and post-release activities, ensuring smooth deployments. Team Collaboration and Mentorship: Mentor and guide junior developers, fostering a collaborative and supportive team environment. Collaborate with business analysts and QA engineers to ensure alignment on requirements and testing. Work closely with client leads to review designs and provide final code reviews. Process Adherence and Reporting: Follow established SDLC processes, code repository guidelines, and access controls. Utilize JIRA for task management and tracking. Provide weekly status and progress reports to project management. Required Skills and Experience: Bachelor's degree in Computer Science, Software Engineering, or a related field. 8+ years of experience in Salesforce development, with a focus on Experience Cloud and Identity. Deep expertise in Apex, LWC, LWR, Aura, Flows, and other Salesforce platform features. Strong understanding of Salesforce governor limits, data access, sharing, and security. Experience with SFDX Scratch Orgs, Git, and source-driven development. Proven experience leading a team of Salesforce developers. Ensure the project's deliverables are aligned with defined SLAs. Excellent communication and interpersonal skills. Ability to work effectively in an Agile/Scrum environment. Excellent written and verbal communication. Preferred Skills: Salesforce certifications (e.g., Platform Developer I/II, JavaScript Developer I).
Experience with CI/CD pipelines.
Posted 2 weeks ago
3.0 - 7.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Who We're Looking For: A hustler - someone who doesn't wait for leads but hunts them down. Tech-curious - if you think "tech is tough," this isn't for you; if you love exploring how AI, Cloud, and Cybersecurity work, keep reading. Fast learner - if you don't know something, you take the initiative to learn it. Confident & persuasive - you can talk business and tech without getting lost. Self-motivated - no hand-holding; you take ownership of your targets. Key Responsibilities: Lead Generation & Outreach - identify and engage with potential clients via LinkedIn, cold calls, and industry events. Understand Client Pain Points - ask the right questions and map our tech solutions to their business needs. Sales Pipeline Management – track deals from lead to closure using CRM tools. Tech Learning & Upskilling – stay updated on Cloud, AI, DevOps, and SaaS trends (we will provide training, but the effort is on you). Collaborate with Tech Teams – bridge the gap between clients and developers, ensuring smooth deal execution. Close Deals & Achieve Targets – drive revenue growth by sealing high-value IT service contracts. Must-Have Skills: 3-7 years of experience in IT sales, SaaS sales, or technology-related business development (juniors with strong tech interest can apply). Strong understanding of Cloud, AI, or DevOps (or willingness to learn fast). Confident communicator – fluent in English, able to engage CXOs and decision-makers. Experience using CRM tools like HubSpot, Zoho, or Salesforce. Ability to sell solutions, not just services – you don't push products, you solve problems. Why Join Tech4Biz Solutions? No corporate bureaucracy – you have the freedom to execute ideas. Fast career growth – perform well, and you'll move up quickly. Unlimited earning potential – performance-based commissions (high-ticket sales = high income). Tech-driven sales training – we'll equip you with all the tech knowledge you need, if you're willing to put in the work. Work with visionaries – we build products that matter, and you'll be a key driver of growth. Do not apply if: You think sales is just about "following up." You avoid learning about technology and rely only on marketing materials. You need micro-management – we want go-getters who thrive on autonomy. Role: Business Development Executive (BDE). Industry Type: IT Services & Consulting. Department: Sales & Business Development. Employment Type: Full Time, Permanent. Role Category: BD / Pre Sales. Education: UG: B.Tech/B.E. in Electronics/Telecommunication, Information Technology, Computers. For more details, please connect with our HR department. You can reach Sneha at sneha.kumari@tech4biz.io or call HR Sneha: 7892166713.
Posted 2 weeks ago
5.0 - 9.0 years
6 - 11 Lacs
Noida
Work from Office
Job Description. Tech Stack: Java + Spring Boot; AWS (ECS, Lambda, EKS); Drools (preferred but optional); APIGEE; API observability, traceability, and security. Skills Required: Strong ability to understand existing codebases and reengineer them, with domain knowledge to some extent. Capability to analyze and integrate the new system's various interfaces with the existing APIs. Hands-on experience with Java, Spring, Spring Boot, AWS, and APIGEE. Familiarity with Drools is an added advantage. Ability to write and maintain JIRA stories (10-15% of the time) and keep existing technical specifications updated. Should take end-to-end ownership of the project, create designs, guide the team and work independently on iterative tasks. Should proactively identify and highlight risks during daily scrum calls and provide regular updates. Mandatory Competencies: Java - Core Java; Others - Microservices (Java); Others - Spring Boot; Cloud - AWS Lambda; Cloud - Apigee; Behavioral - Communication and collaboration. At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
Posted 2 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). * Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
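A minimal sketch of the kind of Snowflake work described above, using the snowflake-connector-python package; the account, credentials, stage and table names are placeholders, and Snowpipe would automate the same COPY pattern in production:

```python
# Minimal sketch: load staged files into Snowflake and verify the row count.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",     # placeholder credentials
    user="example_user",
    password="example_password",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # COPY INTO ingests files from an external/internal stage.
    cur.execute("""
        COPY INTO staging.orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
    cur.execute("SELECT COUNT(*) FROM staging.orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```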
Posted 2 weeks ago
6.0 - 8.0 years
40 - 50 Lacs
Mumbai, Pune
Hybrid
Congratulations, you have taken the first step towards bagging a career-defining role. Join the team of superheroes that safeguard data wherever it goes. What should you know about us? Seclore protects and controls digital assets to help enterprises prevent data theft and achieve compliance. Permissions and access to digital assets can be granularly assigned and revoked, or dynamically set at the enterprise-level, including when shared with external parties. Asset discovery and automated policy enforcement allow enterprises to adapt to changing security threats and regulatory requirements in real-time and at scale. Know more about us at www.seclore.com You would love our tribe: If you are a risk-taker, innovator, and fearless problem solver who loves solving challenges of data security, then this is the place for you! Role: Lead Product Engineer - Developer Productivity Experience: 6 - 8 Years Location: Mumbai/Pune A sneak peek into the role: We are seeking a highly motivated and experienced Lead, Developer Productivity & Platform Engineering to spearhead our efforts in building, scaling, and continuously improving our internal developer platform. In this critical role, you will be responsible for empowering our development teams with the tools, infrastructure, and processes necessary to achieve exceptional productivity, accelerate software delivery, and enhance their overall experience. You will driving the vision, strategy, and execution of our IDP initiatives, with a strong focus on measuring and improving developer effectiveness. Here's what you will get to explore: Leadership: This role blends the responsibilities of an individual contributor with the need to lead a team as the practice grows. While the primary focus is on individual contributions and expertise, the role also requires guiding, mentoring, and coordinating the work of others. Foster a collaborative, innovative, and results-oriented team culture. Define clear roles, responsibilities, and performance expectations for team members. Platform Vision, Strategy & Roadmap: Define and articulate a clear vision, strategy, and roadmap for our internal developer platform (IDP), aligning with overall engineering and business objectives. Identify and prioritize key features and improvements for the IDP based on developer needs and productivity goals. Stay abreast of industry trends and emerging technologies in platform engineering, developer experience, and IDPs (e.g., Backstage). Collaboration & Stakeholder Management: Work closely with application development teams, product managers, security teams, operations, and other stakeholders to understand their pain points, needs, and requirements for the IDP. Effectively communicate the value and progress of the IDP to both technical and non-technical audiences. IDP Design, Development & Maintenance: Lead the design, development, and maintenance of core components of our internal developer platform, emphasizing self-service capabilities, automation, standardization, and a seamless developer experience. Drive the adoption of Infrastructure as Code (IaC), Continuous Integration/Continuous Delivery (CI/CD), and robust observability practices within the platform. Ensure the IDP is scalable, reliable, secure, and cost-effective. Focus on Developer Productivity & Measurement: Define and track key metrics to measure the impact of the IDP on developer productivity (e.g., deployment frequency, lead time for changes, time to recovery, developer satisfaction). 
Implement mechanisms for collecting and analyzing data related to developer workflows and platform usage. Identify and implement solutions to streamline developer workflows, reduce toil, and accelerate application delivery based on data and feedback. Potentially lead initiatives to integrate and leverage tools like Backstage to enhance developer experience and provide a centralized platform. Tooling & Integration: Evaluate and integrate relevant tools and technologies into the IDP ecosystem, including CI/CD systems, monitoring tools, logging solutions, security scanners, and potentially IDP frameworks like Backstage. Ensure seamless integration between different platform components and existing development tools. We can see the next Entrepreneur At Seclore if you: 6+ years of relevant experience in software engineering, platform engineering, or DevOps roles, with increasing levels of responsibility. Proven experience leading and managing engineering teams, including hiring, mentoring, and performance management. Strong understanding of the software development lifecycle and common developer workflows. Deep technical expertise in cloud platforms (e.g., AWS, Azure, GCP) and cloud-native technologies (e.g., Kubernetes, Docker, serverless). Extensive experience with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Significant experience designing and implementing CI/CD pipelines using tools like Jenkins, GitLab CI, GitHub Actions, CircleCI, Argo CD, or Flux CD. Solid understanding of observability principles and hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Datadog), logging solutions (e.g., ELK stack, Splunk), and distributed tracing (e.g., Jaeger, Zipkin). Strong understanding of security best practices for cloud environments and containerized applications, and experience with security scanning tools and secrets management. Experience in managing and configuring Code Quality tools like SonarQube Experience in managing and configuring Git tools like Gitlab Proficiency in at least one Programming language (e.g., Python, Go) for automation. Understanding of API design principles (REST, GraphQL) and experience with building and consuming APIs. Experience with data collection and analysis to identify trends and measure the impact of platform initiatives. Excellent communication, collaboration, and interpersonal skills, with the ability to influence and build consensus across teams. Strong problem-solving and analytical abilities. Experience working in an Agile development environment. Prior experience building and maintaining an Internal Developer Platform (IDP). Hands-on experience with IDP frameworks like Backstage, including setup, configuration, plugin development, and integration with other tools. Familiarity with developer productivity frameworks and methodologies. Experience with other programming languages commonly used by development teams (e.g., Java, Node.js, C++). Experience with service mesh technologies. Knowledge of cost management and optimization in the cloud. Experience in defining and tracking developer productivity metrics. Experience with data visualization tools (e.g., Grafana, Tableau). Why do we call Seclorites Entrepreneurs not Employees? We value and support those who take the initiative and calculate risks. We have an attitude of a problem solver and an aptitude that is tech agnostic. You get to work with the smartest minds in the business. We are thriving not living. 
At Seclore, it is not just about work but about creating outstanding employee experiences. Our supportive and open culture enables our team to thrive. Excited to be the next Entrepreneur, apply today! Don't have some of the above points in your resume at the moment? Don't worry. We will help you build it. Let's build the future of data security at Seclore together.
Posted 2 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Pune
Hybrid
About the job We're seeking a skilled Senior Software Developer to join the analytics platform team at SAS. In this role, you will work on our core codebase, primarily in C, focusing on Compute Core and Compute Server functionalities. You will be a leader in developing the next generation of our analytic engine, helping to shape the future of SAS analytics. You will collaborate with an international team of experienced developers, bringing diverse perspectives to our projects. You will also, develop and maintain enterprise-class software used by organizations around the world. If you're passionate about software development and interested in advancing the state of analytics technology, this role might be for you! You will: Program in C and Golang in a Linux and/or Windows environment. Design and develop high quality, testable and scalable software solutions within established timelines. Be aware of and adhere to R&D best practices and processes. Actively involve other project stakeholders (e.g., managers, developers, user interface and visual designers, product managers) to ensure implementation satisfies functional requirements and is consistent with established R&D standards. Participate in project scoping and scheduling; track progress of individual tasks and alerts stakeholders of issues blocking or preventing completion of task Ensure the quality of the code you write through the development of automated tests (unit, performance, user interface). Conduct code reviews to ensure integrity and cross-product consistency. Work closely with testing by providing thorough reviews of the test plan and communicate when updates to the plan should be made to cover code changes related to enhancements, redesigns and/or bug fixes. Maintain accountability for the entire life cycle of the code including support for both internal and external consumers. In collaboration with technical writers, authors appropriate level of design and technical documentation that satisfies both internal and external consumers. Work with multiple operating systems and anticipate technical anomalies and enhancements for various environments Perform testing of software; verifies, tracks, and fixes "bugs"; modifies software design, as necessary. Determines database compatibility and develops compatible code as appropriate. Prepares feasibility studies and designs tests to determine operating characteristics of software as required What were looking for Youre curious, passionate, authentic, and accountable. These are our values and influence everything we do You have a bachelor’s degree in Computer Science or a related quantitative field You have 8 or more years of experience contributing across the full Software Development Life Cycle You’re well-versed in a broad set of languages such as C, C++, TK, Go, Java, React, JavaScript, Python You have experience with supporting tools such as Docker, Jenkins, Git, Gerrit, Hibernate, and Kubernetes You have experience with both Windows and Linux. You have experience contributing at multiple levels of the software stack. You approach every task with a quality-first mindset Additional preferences (not required): You have experience with open-source container-orchestration systems like Kubernetes You have experience with cloud architectures and at least one major public cloud provider. You have experience with agile software development. Other knowledge, skills, and abilities Exceptional aptitude for problem solving and debugging of complex multitiered software applications. 
Ability to pivot quickly and seamlessly as projects and business needs dictate. Detail oriented and well-organized with a strong ability to prioritize, plan, and execute tasks. Highly skilled in written and verbal communications. Comfortable working in a distributed, team-based environment
Posted 2 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
Coimbatore
Work from Office
*Expertise in Linux / Windows environments *Exposure to Cloud (AWS / Azure) *Proficiency in Docker/Kubernetes *Hands-on with CI/CD Tools *Exposure to Scripting *Infrastructure as Code – Terraform or CloudFormation (preferred) Required Candidate profile We are seeking an experienced Senior DevOps Engineer to lead and optimize our infrastructure and deployment processes.
Posted 2 weeks ago
7.0 - 12.0 years
9 - 15 Lacs
Hyderabad
Work from Office
*Expertise in Linux / Windows environments *Exposure to Cloud (AWS / Azure) *Proficiency in Docker/Kubernetes *Hands-on with CI/CD Tools *Exposure to Scripting *Infrastructure as Code – Terraform or CloudFormation (preferred) Required Candidate profile We are seeking an experienced Senior DevOps Engineer to lead and optimize our infrastructure and deployment processes.
Posted 2 weeks ago