
42 Pipelines Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The ideal candidate must possess strong expertise in steel fabrication, specifically in the construction of pressure vessels, boilers, industrial structures, pipelines, and steel tanks. Proficiency in interpreting technical drawings is crucial for this role, as the individual will be responsible for executing fabrication tasks with minimal supervision. Familiarity with the latest fabrication techniques in line with engineering construction drawings is essential to ensure high-quality output.

Posted 13 hours ago

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Are you passionate about web scraping and ready to take on exciting data-driven projects? Actowiz Solutions is urgently hiring a skilled Senior Python Developer to join our dynamic team in Ahmedabad!

You should have a minimum of 2+ years of experience in Python development, with strong hands-on experience with the Scrapy framework. A deep understanding of XPath/CSS selectors, middleware, and pipelines is essential for this role. Experience in handling CAPTCHAs, IP blocks, and JS-rendered content is also required. To excel in this role, you should be familiar with proxy rotation, user-agent switching, and headless browsers. Proficiency in working with data formats such as JSON, CSV, and databases is a must. Hands-on experience with Scrapy Splash / Selenium is highly desirable. Additionally, good knowledge of Pandas, Docker, AWS, and Celery will be beneficial for this position.

If you are enthusiastic about working on global data projects and looking to join a fast-paced team, we would love to hear from you! To apply, please send your resume to hr@actowizsolutions.com / aanchalg.actowiz@gmail.com or contact HR at 8200674053 / 8401366964. You can also DM us directly if you are interested. Feel free to like, share, or tag someone who might be a good fit for this role!

Join us at Actowiz Solutions and be part of our exciting journey in Python development and web scraping. #PythonJobs #WebScraping #Scrapy #ImmediateJoiner #AhmedabadJobs #PythonDeveloper #DataJobs #ActowizSolutions
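As a rough illustration of the user-agent switching and proxy rotation this listing asks for, here is a minimal stdlib-only Python sketch. The agent strings and proxy addresses are placeholders, not Actowiz infrastructure, and a production scraper would wire this into Scrapy downloader middleware instead:

```python
import itertools
import random

# Placeholder values for illustration only
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/124.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 13_5) Safari/605.1.15",
    "Mozilla/5.0 (X11; Linux x86_64) Firefox/126.0",
]
PROXIES = ["http://10.0.0.1:8080", "http://10.0.0.2:8080"]

proxy_pool = itertools.cycle(PROXIES)  # round-robin over the proxy list

def request_headers() -> dict:
    """Pick a random User-Agent for each outgoing request."""
    return {"User-Agent": random.choice(USER_AGENTS)}

def next_proxy() -> str:
    """Return the next proxy in round-robin order."""
    return next(proxy_pool)
```

In Scrapy, the same idea usually lives in a `process_request` hook of a custom downloader middleware that sets `request.headers` and `request.meta["proxy"]`.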

Posted 1 day ago

4.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Power BI + Microsoft Fabric Lead with over 10 years of experience, you will play a key role in leading the strategy and architecture for BI initiatives. Your responsibilities will include designing and delivering end-to-end Power BI and Microsoft Fabric solutions, collaborating with stakeholders to define data and reporting goals, and driving the adoption of best practices and performance optimization. Your expertise in Power BI, including DAX, Power Query, and advanced visualizations, will be essential to the success of high-impact BI initiatives.

As a Power BI + Microsoft Fabric Developer with 4+ years of experience, you will be responsible for developing dashboards and interactive reports using Power BI, building robust data models, and implementing Microsoft Fabric components such as Lakehouse, OneLake, and Pipelines. Working closely with cross-functional teams, you will gather and refine requirements to ensure high performance and data accuracy across reporting solutions. Your hands-on experience with Microsoft Fabric tools such as Data Factory, OneLake, Lakehouse, and Pipelines will be crucial for delivering effective data solutions.

Key Skills Required:
- Strong expertise in Power BI (DAX, Power Query, advanced visualizations)
- Hands-on experience with Microsoft Fabric (Data Factory, OneLake, Lakehouse, Pipelines)
- Solid understanding of data modeling, ETL, and performance tuning
- Ability to collaborate effectively with business and technical teams

Joining our team will provide you with the opportunity to work with cutting-edge Microsoft technologies, lead high-impact BI initiatives, and thrive in a collaborative, innovation-driven environment. We offer a competitive salary and benefits package to reward your expertise and contributions. If you are passionate about leveraging Power BI and Microsoft Fabric to drive data-driven insights and solutions, we invite you to apply for this full-time position.

Application Question(s):
- What is your current and expected CTC?
- What is your notice period? If you are serving your notice period, what is your Last Working Day (LWD)?

Experience Required:
- Power BI: 4 years (Required)
- Microsoft Fabric: 4 years (Required)

Work Location: In person

Posted 2 days ago

1.0 - 5.0 years

0 Lacs

Jaipur, Rajasthan

On-site

As a passionate and detail-oriented Salesforce Commerce Cloud Developer (B2C) with at least 2 years of experience in web development and exposure to Salesforce Commerce Cloud (Demandware), you will have the opportunity to contribute to e-commerce solutions and work in a collaborative environment. Your role will involve implementing and enhancing digital storefronts for B2C clients.

Your key responsibilities will include supporting the development and customization of e-commerce websites using Salesforce Commerce Cloud (SFCC). You will work with ISML templates, JavaScript, HTML, and CSS to implement UI/UX components, as well as assist in building and modifying controllers, pipelines, and cartridges in SFCC. Additionally, you will participate in troubleshooting, debugging, and performance tuning of storefront features while collaborating with senior developers and designers to deliver high-quality solutions aligned with project goals. It will be essential to ensure that the code follows best practices and is well-documented.

To excel in this role, you should have at least 2 years of experience in web development with exposure to Salesforce Commerce Cloud B2C (Demandware) and a basic understanding of ISML, JavaScript, HTML5, CSS3, and jQuery. Familiarity with Commerce Cloud cartridges, page layouts, and controllers, along with a strong understanding of e-commerce fundamentals and web standards, is required. Your ability to work collaboratively in a team, communicate effectively, and willingness to learn and grow within the Salesforce Commerce ecosystem will be key to success. While not mandatory, a Salesforce Platform Developer I certification, exposure to Agile project environments, and familiarity with Git, Jira, or any version control and task tracking tools would be considered advantageous.

This is a full-time, permanent position with benefits including Provident Fund. The work schedule may involve day shifts and evening shifts. The work location is in person. If you meet the requirements mentioned above, we encourage you to apply and be part of our dynamic team.

Posted 2 days ago

3.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Description

Are you interested in building large-scale distributed infrastructure for the cloud? Oracle's Cloud Infrastructure (OCI) team is building new Infrastructure-as-a-Service technologies that operate at large scale in a distributed multi-tenant cloud environment. Join OCI Networking to build highly scalable and customizable services offering predictable and consistent performance, isolation, and availability. https://www.oracle.com/cloud/networking/

We are a team that builds and maintains distributed services to manage OCI networks. Currently, we are looking for hands-on engineers with expertise and passion for solving difficult problems in automating the monitoring and management of large fleets of networking devices. These are exciting times, and our team is undergoing rapid growth while working on many new ambitious initiatives. An engineer at any level can have significant technical and business impact. Join our team and help us build state-of-the-art IaaS solutions.

Why join OCI Networking? The OCI Networking org has a culture of collaboration that welcomes new people to its ranks. We work together and help each other out, and make sure that the onboarding and ramp-up experience is a great one. We focus on excellent customer experience, scalable architecture, manageable operations, and minimal technical debt, with a strong focus on a reasonable on-call and a good work/life balance.

Responsibilities

As a Sr. Member of Technical Staff on the Network Automation team, you will help design and develop tooling and infrastructure to manage a growing fleet of networking devices. You will be one of the engineers responsible for delivering a highly available and secure fleet of critical OCI Networking infrastructure. Our team owns onboarding new-generation network technologies, deployment tooling, patching, fleet monitoring and automation, and security and access controls. We work with many partner teams in OCI to ensure our networking is best in class.

As a member of our team you will be required to:
- Maintain and build new technologies to automate the management of a distributed fleet of networking devices, including distributed deployment and monitoring tooling.
- Automate and maintain build and test systems, including systems for performance and scalability testing.
- Improve the efficiency of deployment processes across a fast-growing number of regions through automation and scale improvements to tools and dashboards.
- Participate in our on-call rotation, which requires monitoring our fleet and associated services.
- Improve our operational capabilities by developing runbooks and alarming, and by building tools.

Qualifications

You are an expert in Linux, comfortable with Python, Bash, and Java, and have embedded-system knowledge and systems engineering experience. You value simplicity and scale, work comfortably in a collaborative, agile environment, and are excited to learn.

Basic Qualifications:
- Bachelor's in Computer Science and Engineering or related engineering fields
- 3+ years of experience with Linux system engineering
- 2+ years of experience with Python/Bash/Java
- 1+ years of DevOps experience
- Proficient with build tools and pipelines (e.g. TeamCity, Maven, make)

Preferred Qualifications:
- 1+ years of experience with embedded systems
- Experience in CI/CD environments
- Experience with Agile development
- Prior cloud experience
- Hardware qualification experience (embedded development)
- Experience automating management of networking devices

Career Level - IC3

As a member of the software engineering division, you will assist in defining and developing software for tasks associated with the developing, debugging, or designing of software applications or operating systems. Provide technical leadership to other software developers. Specify, design, and implement modest changes to existing software architecture to meet changing needs.

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing [HIDDEN TEXT] or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
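The fleet monitoring and alarming work described above can be sketched in a few lines of Python. This is an illustrative toy, not OCI tooling: the `DeviceStatus` shape, the 90-day patch threshold, and the field names are all assumptions for the example:

```python
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    """One device's result from a fleet poll (fields are illustrative)."""
    hostname: str
    reachable: bool
    uptime_days: int

def fleet_summary(statuses: list[DeviceStatus], max_uptime_days: int = 90) -> dict:
    """Aggregate a fleet poll: unreachable devices would page on-call,
    long-uptime devices get flagged for a patching window."""
    unreachable = [s.hostname for s in statuses if not s.reachable]
    needs_patch = [s.hostname for s in statuses
                   if s.reachable and s.uptime_days > max_uptime_days]
    return {"unreachable": unreachable, "needs_patch": needs_patch}
```

Real fleet automation would feed this from SSH/API polls of the devices and emit alarms, but the aggregation step looks much like this.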

Posted 3 days ago

6.0 - 8.0 years

19 - 35 Lacs

Hyderabad

Work from Office

We are hiring a Senior Data Engineer with 6-8 years of experience. Education: Candidates from premier institutes like IIT, IIM, IISc, NIT, IIIT, and other top-ranked institutions in India are highly encouraged to apply.

Posted 3 days ago

9.0 - 13.0 years

10 - 20 Lacs

Ahmedabad

Work from Office

Dynamics 365 CRM/Power Platform Technical Lead

Job Summary
We are seeking an experienced Dynamics 365 CE/Power Platform Technical Lead to drive the design, development, and implementation of enterprise-grade solutions. The ideal candidate will have 8+ years of total experience, with at least 5+ years of relevant experience in Dynamics 365 CRM, Power Platform, and Microsoft Azure integrations. The role requires a strong technical background, hands-on expertise in solution architecture, and a deep understanding of at least 2 modules from Sales, Customer Service, Field Service, and Customer Insights. The candidate should also have experience with Azure services, DevOps, and system integrations, along with excellent communication and leadership skills.

Key Responsibilities

Solution Design
- Lead the design and development of Dynamics 365 CRM and Power Platform solutions.
- Define best practices, governance, and security strategies for Power Apps and Power Automate.
- Design and implement integrations with Azure services and third-party applications.
- Ensure scalability, security, and performance optimization of implemented solutions.

Development & Customization
- Develop and extend Dynamics 365 CE applications using C#, JavaScript, TypeScript, Plugins, and PCF Controls.
- Design and implement custom Canvas and Model-Driven Apps.
- Configure and automate business processes using Power Automate.
- Work with Dataverse (CDS) for efficient data modelling and security management.
- Develop Azure-based integrations using Azure Functions, Logic Apps, and API Management.
- Implement DevOps practices using Azure DevOps, Git, and CI/CD pipelines.

Integration & Data Management
- Build integrations using REST APIs, Webhooks, and SSIS.
- Design and manage Dataflows for seamless data processing across platforms.
- Ensure smooth communication between Dynamics 365 CRM and external systems.

Technical Leadership & Team Management
- Lead and mentor a team of Dynamics 365 CRM and Power Platform developers.
- Conduct code reviews, enforce coding standards, and provide technical guidance.
- Work closely with business analysts, functional consultants, and stakeholders to align technical solutions with business needs.
- Conduct feasibility analysis and provide expert recommendations.

Required Skills & Experience
- Dynamics 365 CRM Modules: Sales, Customer Service, Field Service, Customer Insights
- Power Platform: Power Apps (Canvas & Model-Driven), Power Automate
- Development: C#, JavaScript, TypeScript, Plugins, Custom Connectors, PCF Controls
- Azure Services: Azure Functions, Logic Apps, API Management, Application Insights, Service Bus
- DevOps & CI/CD: Azure DevOps, Git, Pipelines, Automated Deployments
- Integration: REST APIs, Webhooks, SSIS, Dataflows

Posted 3 days ago

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You should have experience in developing business solutions using Salesforce Commerce Cloud (Demandware) and possess good knowledge of SFCC/Demandware Script and Java OOP/Node.js. Expertise in RESTful web services, Ajax-driven client applications, and advanced JavaScript architectures is essential. Additionally, you should have a good understanding of SFRA and a strong background in World Wide Web Consortium (W3C) accessibility guidelines, model-view-controller (MVC), agile development, and related tools.

Your role will require experience in using Pipelines, JS controllers, ISML templates, and real-time data exchange via web services. You should be familiar with configuring jobs in Business Manager and have strong SQL knowledge. Exposure to technical reports, an understanding of Demandware limitations such as quotas, and proficiency in Demandware foundational concepts are desired. Prior experience with SiteGenesis, UX Studio, content assets or slots, and the Demandware Catalog will be advantageous. SFCC certifications are preferred. Effective written and verbal communication skills are necessary, along with the ability to work independently and complete tasks efficiently. Candidates who are available to join sooner will be given preference.

Key Skills and Experience:
- SFRA Framework
- Node.js, Demandware Script
- API integrations and Service Framework
- Job Framework
- Third-party integration
- OCAPI and SCAPI
- MVC framework
- Analytical skills
- Debugging and troubleshooting

Your proficiency in the above-mentioned skills will be crucial for excelling in this role.

Posted 4 days ago

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Engineer plays a critical role in the organization by designing, building, and maintaining scalable data pipelines and infrastructure. Collaborating closely with cross-functional teams, you ensure the smooth flow of data and enhance data-driven decision-making.

Your key responsibilities include designing, developing, and maintaining data pipelines and ETL processes using tools such as Snowflake, Azure, AWS, Databricks, Informatica, and DataStage. You will work with data scientists and stakeholders to understand data requirements, ensuring data availability and integrity. Additionally, optimizing and tuning the performance of data infrastructure and processing systems, implementing data security and privacy measures, troubleshooting and performance tuning of ETL processes, and developing documentation for data infrastructure and processes are crucial aspects of your role. You will also participate in the evaluation and selection of new technologies and tools to enhance data engineering capabilities, provide support and mentorship to junior data engineers, adhere to best practices in data engineering, and maintain high standards of quality. Collaboration with cross-functional teams to support data-related initiatives and projects is essential for success in this role.

To qualify for this position, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with proven experience in data engineering, ETL development, and data warehousing. Proficiency in Snowflake, AWS, Azure, Databricks, Informatica, and DataStage; strong programming skills in languages like Python, SQL, or Java; experience with big data technologies and distributed computing; and knowledge of data modeling and database design principles are required. Your ability to work with stakeholders, understand data requirements, and translate them into technical solutions, together with knowledge of data governance, data quality, and data integration best practices, is critical. Experience with cloud data platforms and services, excellent problem-solving and analytical abilities, strong communication and collaboration skills, and the ability to thrive in a fast-paced and dynamic environment are essential for success. Relevant certifications in cloud platforms and data engineering, such as AWS Certified Big Data - Specialty, Microsoft Certified: Azure Data Engineer, and SnowPro Core Certification, will be advantageous.

In summary, as a Data Engineer you will play a vital role in designing, building, and maintaining data pipelines and infrastructure, collaborating with cross-functional teams, optimizing data performance, and ensuring data security and privacy to support data-driven decision-making and initiatives effectively.
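The extract-transform-load cycle this role revolves around can be shown end to end with nothing but the standard library. This is a deliberately tiny sketch (CSV in, SQLite out; the `orders` table, column names, and the "amount must be present" quality rule are invented for illustration), whereas real pipelines here would target Snowflake or Databricks:

```python
import csv
import io
import sqlite3

def extract(csv_text: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop rows failing a basic quality check."""
    out = []
    for r in rows:
        if r["amount"]:  # data-quality rule: amount must be present
            out.append((r["order_id"], float(r["amount"])))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: idempotent upsert into the target table; return row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

The same three stages appear in Informatica mappings or DataStage jobs; only the engines and connectors change.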

Posted 4 days ago

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Are you passionate about web scraping and ready to take on exciting data-driven projects? Actowiz Solutions is urgently hiring a skilled Senior Python Developer to join our dynamic team in Ahmedabad!

We are looking for someone with strong hands-on experience with the Scrapy framework and a deep understanding of XPath/CSS selectors, middleware, and pipelines. Experience in handling CAPTCHAs, IP blocks, and JS-rendered content is essential. Additionally, familiarity with proxy rotation, user-agent switching, and headless browsers is a plus. Proficiency in data formats such as JSON, CSV, and databases is required. Hands-on experience with Scrapy Splash / Selenium, along with good knowledge of Pandas, Docker, AWS, and Celery, will be beneficial for this role.

If you are an immediate joiner with at least 2 years of experience in Python development and meet the above requirements, we encourage you to apply! This position is based in Ahmedabad and requires working from the office. To apply, please send your resume to hr@actowizsolutions.com / aanchalg.actowiz@gmail.com or contact HR at 8200674053 / 8401366964. Alternatively, you can also DM us directly.

If you are ready to join a fast-paced team and work on global data projects, we would love to hear from you! Feel free to like, share, or tag someone who might be a fit for this position. #PythonJobs #WebScraping #Scrapy #ImmediateJoiner #AhmedabadJobs #PythonDeveloper #DataJobs #ActowizSolutions

Posted 4 days ago

7.0 - 12.0 years

0 Lacs

Karnataka

On-site

You have an exciting opportunity with a global IT services and consulting company headquartered in Tokyo, Japan. The company excels in offering a wide array of IT services, including application development, infrastructure management, and business process outsourcing. Its expertise extends to consulting services focusing on business and technology, as well as digital solutions emphasizing transformation and user experience design. With a strong emphasis on data and intelligence services, it specializes in analytics, AI, and machine learning. Its comprehensive portfolio also includes cybersecurity, cloud, and application services tailored to meet the diverse needs of businesses worldwide.

The ideal candidate for this position should have 7 to 12 years of experience with ITX, including IBM App Connect Enterprise (ACE) v12 and IBM Transformation Extender (ITX) v10. Strong development experience and knowledge of ESQL, troubleshooting, debugging, and root cause analysis are essential. Proficiency in IBM MQ v9, Oracle for complex SQL queries, and Linux for working with ACE on the Linux platform is required. Experience with Git, Azure DevOps (ADO), pipelines, and automation is highly desirable. Previous experience working at Humana would be a significant advantage.

Interested candidates are encouraged to submit their updated resumes for consideration. For more job opportunities, please visit Jobs In India - VARITE. If this opportunity is not suitable for you, feel free to share it with your network. VARITE offers a Candidate Referral program where you can earn rewards by referring qualified candidates who complete a three-month assignment with the company.

About VARITE: VARITE is a global staffing and IT consulting company that provides technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. As a primary and direct vendor to leading corporations in various verticals, including networking, cloud infrastructure, digital marketing, utilities, financial services, and more, VARITE offers equal opportunities to all candidates.

This position offers a unique opportunity to work with a dynamic team in a global IT services and consulting company. If you meet the qualifications and are looking to advance your career in a fast-paced and innovative environment, we encourage you to apply and explore this exciting opportunity further.

Posted 4 days ago

8.0 - 13.0 years

20 - 35 Lacs

Pune

Hybrid

Job Description:

JOB SUMMARY
We are seeking an experienced Microsoft Fabric architect who brings technical expertise and architectural instincts to lead the design, development, and scalability of our secured enterprise-grade data ecosystem. This role is not a traditional BI/Data Engineering position; we are looking for deep hands-on expertise in Fabric administration, CI/CD integration, and security/governance configuration in production environments.

ESSENTIAL DUTIES
- Provide technical leadership on design and architectural decisions, data platform evolution, and vendor/tool selection
- Leverage expertise in data Lakehouse on Microsoft Fabric, including optimal use of OneLake, Dataflows Gen2, Pipelines, and Synapse Data Engineering
- Build and maintain scalable data pipelines to ingest, transform, and curate data from a variety of structured and semi-structured sources
- Implement and enforce data modelling standards, including medallion architecture, Delta Lake, and dimensional modelling best practices
- Collaborate with analysts and business users to deliver well-structured, trusted datasets for self-service reporting and analysis in Power BI
- Establish data engineering practices that ensure reliability, performance, governance, and security
- Monitor and tune workloads within the Microsoft Fabric platform to ensure cost-effective and efficient operations

EDUCATION / CERTIFICATION REQUIREMENTS
- Bachelor's degree in computer science, data science, or a related field is required.
- A minimum of 3 years of experience in data engineering, with at least 2 years in a cloud-native or modern data platform environment, is required.
- Prior experience in a public accounting, financial, or other professional services environment is preferred.

SUCCESSFUL CHARACTERISTICS / SKILLS
- Extensive, hands-on expertise with Microsoft Fabric, including Dataflows Gen2, Pipelines, Synapse Data Engineering, Notebooks, and OneLake.
- Proven experience designing Lakehouse or data warehouse architecture, including data ingestion frameworks, staging layers, and semantic models.
- Strong SQL and T-SQL skills and familiarity with Power Query (M) and Delta Lake formats.
- Understanding of data governance, data security, lineage, and metadata management practices.
- Ability to lead technical decisions and set standards in the absence of a dedicated Data Architect.
- Strong communication skills with the ability to collaborate across technical and non-technical teams.
- Results driven; high integrity; ability to influence, negotiate, and build relationships; superior communication skills; comfortable making complex decisions and leading teams through complex challenges.
- Self-disciplined to work in a virtual, agile, globally sourced team.
- Strategic, out-of-the-box thinker with problem-solving experience to assess, analyze, troubleshoot, and resolve issues.
- Excellent analytical skills, extraordinary attention to detail, and ability to present recommendations to business teams based on trends, patterns, and modern best practices.
- Experience with Power BI datasets and semantic modelling is an asset.
- Familiarity with Microsoft Purview or similar governance tools is an asset.
- Working knowledge of Python, PySpark, or KQL is an asset.
- Experience with and passion for technology and for providing an exceptional experience, both internally for our employees and externally for clients and prospects.
- Strong ownership, bias to action, and know-how to succeed in ambiguity.
- Ability to deliver value consistently by motivating teams towards achieving goals.

Do share your resume with my email address: sachin.patil@newvision-software.com

Please share your experience details:
- Total Experience:
- Relevant Experience:
- Current CTC:
- Expected CTC:
- Notice / Serving (LWD):
- Any Offer in hand: LPA
- Current Location:
- Preferred Location:
- Education:

Please share your resume and the above details for the hiring process: sachin.patil@newvision-software.com
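The medallion (bronze/silver/gold) layering named in the duties has a simple contract: land raw, then cleanse and conform, then aggregate for reporting. A pure-Python sketch of those layer contracts, with invented field names (`customer_id`, `revenue`) and no Fabric dependencies, since the real implementation would sit in Dataflows Gen2 or Synapse notebooks over Delta tables:

```python
def bronze(raw_records: list[dict]) -> list[dict]:
    """Bronze: land raw records as-is, tagged with lineage metadata."""
    return [{**r, "_source": "landing"} for r in raw_records]

def silver(bronze_rows: list[dict]) -> list[dict]:
    """Silver: cleanse and conform; drop records missing the business key."""
    return [r for r in bronze_rows if r.get("customer_id") is not None]

def gold(silver_rows: list[dict]) -> dict:
    """Gold: aggregate into a reporting-ready measure (revenue per customer)."""
    totals: dict = {}
    for r in silver_rows:
        totals[r["customer_id"]] = totals.get(r["customer_id"], 0) + r["revenue"]
    return totals
```

The point of the pattern is that each layer has one job, so data quality issues are caught at the silver boundary before they reach Power BI semantic models.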

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Do you want to design and build attractive digital products and services? Do you aspire to play a key role in transforming our firm into an agile organization? At UBS, the reimagination of our work processes, connections with colleagues, clients, and partners, and value delivery is ongoing. Embracing agility will enhance our responsiveness, adaptability, and innovation.

As an individual in this role, your responsibilities will include:
- Designing, developing, and enhancing digital products and technology services for clients and employees
- Applying a wide range of software engineering practices, from user needs analysis to feature development, automated testing, and deployment
- Ensuring quality, security, reliability, and compliance of solutions by implementing digital principles and meeting functional and non-functional requirements
- Incorporating observability into solutions, monitoring production health, assisting in incident resolution, and addressing the root cause of risks and issues
- Understanding, representing, and advocating for client needs
- Sharing knowledge and expertise with colleagues, participating in hiring processes, and contributing to engineering culture and internal communities regularly

In the agile operating model at UBS, teams are aligned with larger products and services to meet client needs and consist of multiple autonomous pods. You will be a part of the Save, protect, and grow team in Pune, focusing on the Advisory Portfolio Management business area, which includes portfolio management, model management, trading, and reconciliation. This team, spread globally, manages a significant percentage of Assets Under Management (AUM) for UBS.

Your expertise should include:
- Strong understanding of RDBMS concepts, PL/SQL, data modeling, and data design
- Hands-on experience in query optimization
- Working knowledge of PostgreSQL
- Experience in ETL and ELT processing using tools like Azure Data Factory (ADF)
- Good knowledge of the Azure environment, containerization concepts, cloud-native development practices, Azure wizard, etc.
- Solid grasp of DevOps and CI/CD concepts (GitLab, TeamCity, automated testing, pipelines, Ansible, Rex, Maven, IntelliJ, Sonar)
- Resilient team player with strong interpersonal skills, driving initiatives independently
- Experience in GitLab-facilitated software delivery

UBS, the world's largest and only truly global wealth manager, operates through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in major financial centers across more than 50 countries, our global reach and expertise differentiate us from competitors. As part of our hiring process, you may be requested to complete assessments.

At UBS, flexible working arrangements such as part-time, job-sharing, and hybrid (office and home) options are available when feasible. Our purpose-led culture and global infrastructure enable us to collaborate and work agilely to meet business needs. Great work is achieved through collaborative efforts. Join #teamUBS and contribute to making an impact with us. UBS is an Equal Opportunity Employer that values and empowers individuals from diverse backgrounds, cultures, perspectives, skills, and experiences within our workforce.
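The query-optimization skill asked for above comes down to reading execution plans before and after an index change. A minimal sketch using SQLite's `EXPLAIN QUERY PLAN` (the role targets PostgreSQL, where the analogue is `EXPLAIN`; the `trades` table and index name here are invented for the demonstration):

```python
import sqlite3

def lookup_plan(conn: sqlite3.Connection) -> str:
    """Return the query plan text for a point lookup on trades.account_id."""
    rows = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE account_id = ?", ("A1",)
    ).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the plan detail

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, account_id TEXT, qty INT)"
)
before = lookup_plan(conn)  # no index: the planner must scan the table
conn.execute("CREATE INDEX idx_trades_account ON trades(account_id)")
after = lookup_plan(conn)   # with the index: a keyed search instead
```

The same workflow (capture plan, add or adjust an index, confirm the plan changed) carries straight over to PostgreSQL with `EXPLAIN (ANALYZE)`.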

Posted 1 week ago

3.0 - 8.0 years

13 - 19 Lacs

Bengaluru

Work from Office

Dear Candidate,

Please find below the open roles with one of our clients on a full-time basis:

Role 1: Developer – Linux Audio Device
Experience: 3–10 Years
Education: B.Tech/BE
Location: Bangalore
Primary Skills: Board Support Package (BSP), C Language, FreeRTOS
Other Skills: C, JTAG, Signal Analyzers

Role 2: Developer – Linux Multimedia Codec Integration
Experience: 3–10 Years
Education: B.Tech/BE
Location: Bangalore
Skills: C, C++, Linux drivers, algorithms, pipelines, H264, VP9, HEVC, Linux V4L2, GStreamer, Android, OpenGL, V4L2, DRM, Linux threads, system calls, serialization mechanisms, embedded Linux user space applications, GDB, KDB, Trace

Role 3: Developer – Linux Display
Experience: 3–10 Years
Education: B.Tech/BE
Location: Bangalore
Skills: C, C++, Linux drivers, algorithms, pipelines, DRM/KMS, Kernel Drivers, HDMI, MIPI DSI protocol, DSI Panels, White Balance, Histogram, Color Correction, image formats, Wayland/Weston, Linux threads, system calls, serialization mechanisms, embedded Linux user space applications, GDB, KDB, Trace

Role 4: Developer – Linux Camera Pipeline
Experience: 3–10 Years
Education: B.Tech/BE
Location: Bangalore
Skills: C, C++, Linux drivers, algorithms, pipelines, Histogram, 3A algorithms, Color Correction, image formats, Media Controller (Open Source), V4L2, CSI2, GStreamer, OpenGL, DRM, Linux threads, system calls, serialization mechanisms, embedded Linux user space application development, GDB, KDB, Trace

If you are interested, please share your updated CV along with the following details to viharika@precisiontechcorp.com:
- Full Name:
- Total Experience:
- Relevant Experience:
- Official Notice Period:
- Negotiable Notice Period:
- Last Working Day (if serving notice):
- Current CTC:
- Expected CTC:
- Any Offers in Hand (Yes/No):
- Reason for Change:
- Reason for Considering Another Offer (if applicable):
- Current Location:
- Preferred Location:

Looking forward to your response.

Best regards,
Viharika
viharika@precisiontechcorp.com

Posted 1 week ago

6.0 - 9.0 years

0 - 0 Lacs

pune

On-site

We are hiring a Spark Developer for an India-based MNC.

Profile: Spark Developer
Experience: 6 to 9 years
Location: Pune

Responsibilities:
- Spark with Scala/Python/Java; experience and expertise in at least one of Java, Scala, or Python
- Experience and expertise in the Spark architecture
- Experience in the range of 8–10 years or more
- Good problem-solving and analytical skills
- Ability to comprehend business requirements and translate them into technical requirements
- Good communication and collaboration skills, within the team and across vendors
- Familiarity with the development life cycle, including CI/CD pipelines
- Proven experience in, and interest in, supporting existing strategic applications
- Familiarity with agile methodology

Interested candidates can share their updated CV/resume at 9582342017 or vimhr11@gmail.com.

Regards,
Kirti Shukla
HR Recruiter
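The "Spark with Scala/Python/Java" skill above centres on chained transformations (flatMap, then reduceByKey). As a rough illustration only, here is that style expressed with Python builtins; a real job would run these same steps through pyspark's SparkSession/RDD APIs, which are not shown here.

```python
from functools import reduce
from collections import Counter

# Toy stand-in for the Spark transformation style: flatMap -> reduceByKey.
lines = ["spark with scala", "spark with python", "spark architecture"]

# "flatMap": split each line into words
words = [w for line in lines for w in line.split()]

# "reduceByKey": fold word occurrences into per-key counts
counts = reduce(lambda acc, w: acc + Counter([w]), words, Counter())

print(counts["spark"])  # → 3
```

In actual Spark, the same pipeline would be lazily evaluated and partitioned across executors; the per-key reduction is what makes it shuffle-efficient.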

Posted 1 week ago

10.0 - 15.0 years

6 - 7 Lacs

Bhiwadi

Work from Office

Role & responsibilities:
- Overseeing and managing plant maintenance, both routine and preventive.
- Managing staff and workmen to get the job done.

Preferred candidate profile:
- Preference for BE/Diploma in Mechanical Engineering candidates aged 40–45, with 10–15 years of work experience in a chemical process plant and thorough knowledge of the operation and maintenance of chemical process equipment and electrical equipment.
- Capable of getting work done and willing to lead by example.
- Preference for married candidates residing, or willing to reside, in Bhiwadi.

Perks and benefits:
- Good salary and perks.
- House rent and conveyance allowance for stay in Bhiwadi.
- Annual bonus of 8.33%, or as per company policy.
- Annual increment after successful completion of 1 year of work post job confirmation.

Posted 1 week ago

2.0 - 6.0 years

0 Lacs

haryana

On-site

The next step of your career starts here, where you can bring your own unique mix of skills and perspectives to a fast-growing team.

Metyis is a global and forward-thinking firm operating across a wide range of industries, developing and delivering AI & Data, Digital Commerce, Marketing & Design solutions, and Advisory services. At Metyis, our long-term partnership model brings long-lasting impact and growth to our business partners and clients through extensive execution capabilities. With our team, you can experience a collaborative environment with highly skilled multidisciplinary experts, where everyone has room to build bigger and bolder ideas. Being part of Metyis means you can speak your mind and be creative with your knowledge. Imagine the things you can achieve with a team that encourages you to be the best version of yourself. We are Metyis. Partners for Impact.

What we offer:
- The opportunity to accelerate the pace of digitalization and ecommerce growth through advanced technology, business intelligence, and analytics.
- Driving high-impact insights that enhance decision-making across the entire organization.
- Driving brand equity and digital sales through enhanced digital experiences.
- Regular interaction with senior business and ecommerce leaders to drive their business towards impactful change.
- Becoming part of a fast-growing, international, and diverse team.

Responsibilities:
- Understand eCommerce architecture and product, order, and inventory flows in order to create detailed designs and architectural documents and develop complex applications.
- Work with Salesforce Commerce Cloud (SFCC) and partners on any site changes.
- Code and deploy applications in a cross-platform, cross-browser environment.
- Support the full code review and release management process for SFCC.
- Work with Salesforce Commerce Cloud (formerly Demandware) to develop and implement new features.
- Write clean code that can be well tested.
- Analyze and improve site speed and stability.
- Document technical and functional specifications.
- Guide and mentor team members.

Requirements:
- 2–5 years of experience.
- Professional experience developing business solutions using Salesforce Commerce Cloud (Demandware).
- Experience with Pipelines, JS Controllers, ISML Templates, and real-time data exchange using web services.
- Experience integrating custom services into SFCC.
- Good understanding of Business Manager and configuring jobs.
- Exposure to technical reports and an understanding of Demandware limitations such as quotas.
- Understanding of all administrative options, tasks, and interfaces available in Business Manager.
- Proficiency in Demandware foundational concepts, with knowledge of Site Genesis, UX Studio, Content Assets/Slots, and the Demandware catalogue.
- SFCC certification is preferred.
- Extensive knowledge of SFCC core functions such as Business Manager, store location, configuration feeds, merchandising, storefront configuration, integrating DW cartridges, developing custom cartridges, and REALM creation and management.
- Good knowledge of integration frameworks: service framework, job framework, and integration framework.
- Good knowledge of the build and release process.
- Participation in a minimum of 2 SFCC implementations.
- Good to have: experience with other platforms such as HCL Commerce, SAP Commerce Cloud, Magento, and DevOps.
- Ability to innovate and deal effectively with rapid change in a positive manner.
- Excellent analytical thinking, problem-solving, organizational, and time management skills.
- Strong initiative, a proactive attitude, and the ability to meet deadlines.
- Strong written and oral communication skills in English.
- Willingness to travel for business meetings, as required.

Posted 1 week ago

4.0 - 6.0 years

1 - 2 Lacs

Hyderabad

Work from Office

Mechanical Fitter: ITI Fitter with 4–6 years of experience. Perform fitting jobs on reactors, pipelines, valves, and agitators; routine maintenance, breakdown support, and adherence to safety protocols.

Posted 1 week ago

14.0 - 18.0 years

16 - 20 Lacs

Pune, Bengaluru

Work from Office

We are looking for a highly experienced Senior Java Developer with strong hands-on skills in Java Spring Boot and Kafka. The candidate should be passionate about building scalable, robust applications and must be able to independently handle hands-on development work.

Primary Responsibilities:
- 80% hands-on development with Java Spring Boot
- Interact directly with clients, requiring good communication skills
- Build and maintain Kafka-based integrations
- Apply basic CI/CD and DevOps practices
- Contribute to architecture discussions and performance tuning
- Troubleshoot and optimize application performance

Key Skills Required:
- Java (Spring Boot): 7 to 8+ years of deep hands-on experience
- Kafka: minimum 2 to 3 years (mandatory)
- CI/CD and DevOps knowledge: Git, Jenkins, pipelines, basic scripting
- Strong problem-solving and debugging skills
- Excellent communication for client-facing collaboration
- Total industry experience: 14+ years

Posted 1 week ago

6.0 - 10.0 years

2 - 5 Lacs

Hyderabad, Telangana, India

On-site

Requirement for a strong PostgreSQL Senior Developer/Architect with 6 to 10+ years of experience working with PostgreSQL.

- Excellent understanding of RDBMS concepts, PL/SQL, data modelling, and data design.
- Hands-on experience in query optimization; working knowledge of PostgreSQL is a must.
- Experienced in ETL and ELT processing via tools like Azure Data Factory (ADF).
- Good knowledge of the Azure environment, containerization concepts, cloud-native development practices, Azure Wizard, etc.
- Excellent understanding of DevOps and CI/CD concepts (GitLab, TeamCity, automated testing, pipelines, Ansible, Rex, Maven, IntelliJ, SONAR).
- Excellent understanding of Agile methodologies and hands-on delivery experience in a pod setup.
- Hands-on experience in database migration projects from Mainframe DB2 (on z/OS) to Azure PostgreSQL, including at least one project migrating DB2 database objects such as schemas, data, stored procedures, and constraints.
- Good to have: exposure to migrating Mainframe DB2 database objects into their PostgreSQL equivalents (schemas, tables, constraints, views, indexes, triggers, sequences, usage lists).
- Experience converting DB2 stored procedure functionality into equivalent PostgreSQL SQL code, or writing new PostgreSQL procedures/APIs.
- Optimizing SQL queries and database configurations for PostgreSQL.
- Ability to write database queries and code while maintaining data quality, privacy, and security.
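The query-optimization skill listed above usually starts with reading the planner's output before and after adding an index. A minimal sketch of that workflow, using the standard-library SQLite module as a stand-in (on PostgreSQL the equivalent is EXPLAIN ANALYZE via a client such as psycopg2; the table and index names here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without an index: the plan detail reports a full table scan
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()
print(plan)  # detail column mentions a SCAN of orders

# After adding an index, the planner switches to an index search
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()
print(plan)  # detail column mentions the index idx_orders_customer
```

The same before/after comparison drives most index and configuration tuning work, whatever the engine.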

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Engineer at our company, you will be responsible for building and maintaining secure, scalable data pipelines using Databricks and Azure. Your role will involve handling ingestion from diverse sources such as files, APIs, and streaming data, performing data transformation, and ensuring quality validation. Additionally, you will collaborate closely with subsystem data science and product teams to ensure ML readiness.

To excel in this role, you should possess the following skills and experience:
- Technical proficiency in Notebooks (SQL, Python), Delta Lake, Unity Catalog, ADLS/S3, job orchestration, APIs, structured logging, and IaC (Terraform).
- Delivery expertise in trunk-based development, TDD, Git, and CI/CD for notebooks and pipelines.
- Integration knowledge encompassing JSON, CSV, XML, Parquet, and SQL/NoSQL/graph databases.
- Strong communication skills enabling you to justify decisions, document architecture, and align with enabling teams.

In return for your contributions, you will benefit from:
- Proximity Talks: engage with other designers, engineers, and product experts and learn from industry leaders.
- Continuous learning opportunities: work alongside a world-class team, challenge yourself daily, and expand your knowledge base.

About Us:
Proximity is a trusted technology, design, and consulting partner for prominent Sports, Media, and Entertainment companies globally. Headquartered in San Francisco, we also have offices in Palo Alto, Dubai, Mumbai, and Bangalore. Since 2019, our team at Proximity has developed high-impact, scalable products used by 370 million daily users, and our client companies have a combined net worth of $45.7 billion.

Join our diverse team of coders, designers, product managers, and experts at Proximity. We tackle complex problems and build cutting-edge tech solutions at scale. As part of our rapidly growing team of Proxonauts, your contributions will significantly impact the company's success. You will have the opportunity to collaborate with experienced leaders who have spearheaded multiple tech, product, and design teams.

To learn more about us:
- Watch our CEO, Hardik Jagda, share insights about Proximity.
- Discover Proximity's values and meet some of our Proxonauts.
- Explore our website, blog, and design wing, Studio Proximity.
- Follow us on Instagram for behind-the-scenes content: @ProxWrks and @H.Jagda.
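The "data transformation and quality validation" responsibility in this role boils down to checking each ingested record against an expected schema before it lands in a curated table. A minimal standard-library sketch of that step (the field names and records are invented for illustration; in the actual stack this logic would run in a Databricks notebook against Delta Lake tables):

```python
import json

REQUIRED_FIELDS = {"event_id", "timestamp", "payload"}  # hypothetical schema

def validate(record: dict) -> bool:
    """A record passes if all required fields are present and non-null."""
    return (REQUIRED_FIELDS <= record.keys()
            and all(record[f] is not None for f in REQUIRED_FIELDS))

# Simulated ingestion of newline-delimited JSON from a file or API
raw = [
    '{"event_id": 1, "timestamp": "2024-01-01T00:00:00Z", "payload": {"v": 1}}',
    '{"event_id": 2, "payload": null}',  # missing timestamp, null payload
]

records = [json.loads(line) for line in raw]
good = [r for r in records if validate(r)]
bad = [r for r in records if not validate(r)]

print(len(good), len(bad))  # → 1 1
```

Routing the failing records to a quarantine table rather than dropping them silently is what keeps the pipeline auditable and ML-ready.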

Posted 1 week ago

5.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Service Reliability Engineer at Proofpoint, you will develop a deep understanding of the various services and applications that come together to deliver Proofpoint's next-generation security products. Your primary responsibility will be maintaining and extending the Elasticsearch and Splunk clusters used for critical near-real-time data analysis. This role involves continually evaluating the performance of these clusters, identifying and addressing developing problems, planning changes for high-load events, applying security fixes, testing and performing upgrades, as well as enhancing the monitoring and alert infrastructure. You will also play a key role in maintaining other components of the data pipeline, which may involve serverless or server-based systems for data ingestion into the Elasticsearch pipeline. Optimizing cost vs. performance will be a focus, including testing new hosts or configurations. Automation is a priority, utilizing tools like Puppet and various scripting mechanisms to achieve a build once/run everywhere system. Your work will span various types of infrastructure, including public cloud, Kubernetes clusters, and private data centers, providing exposure to diverse operational environments. Building effective partnerships across different teams within the organization, such as Product, Engineering, and Operations, is crucial. Participation in an on-call rotation and addressing escalated issues promptly are also part of the role. To excel in this position, you are expected to have a Bachelor's degree in computer science, information technology, engineering, or a related discipline. Your expertise should include proficient administration and management of Elasticsearch clusters, with secondary experience in managing Splunk clusters. Proficiency in provisioning and Configuration Management tools like Puppet, Ansible, and Rundeck is essential. 
Experience building automation and Infrastructure as Code using tools like Terraform, Packer, or CloudFormation templates is a plus. You should also be familiar with monitoring and logging tools such as Splunk, Prometheus, and PagerDuty, as well as scripting languages like Python, Bash, Go, Ruby, and Perl. Experience with CI/CD tools like Jenkins, Pipelines, and Artifactory will be beneficial. An inquisitive mind, effective troubleshooting skills, and the ability to navigate a complex system to extract meaningful data are essential qualities for success in this role. In addition to a competitive salary and benefits package, Proofpoint offers a culture focused on talent development, regular promotion cycles, company-sponsored education, and certifications. You will have the opportunity to work with cutting-edge technologies, participate in employee engagement initiatives, and benefit from annual health check-ups and insurance coverage. The company is committed to fostering diversity and inclusion in the workplace, offering hybrid work options, flexible hours, and inclusive facilities to support employees with diverse needs. Persistent Ltd. is an Equal Opportunity Employer that values diversity and prohibits discrimination and harassment. Join us to accelerate your growth professionally and personally, make a positive impact using the latest technologies, and collaborate in an innovative and inclusive environment to unlock global opportunities for learning and development. Let's unleash your full potential at Persistent.

Posted 1 week ago

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Gen AI developer with experience in production-level deployment, you will be responsible for designing and implementing LLMs at scale. You will participate as a team member in fully agile Scrum deliveries and work on full-stack AI/ML design, building and maintaining efficient and reliable Gen AI code leveraging pipelines. Additionally, you will utilize your hosting and deployment knowledge in GCP, along with advanced engineering concepts, to build a user-friendly UI interface for easy adoption. To be successful in this role, you should have 4-5 years of overall experience with significant exposure to LLM development, design, architecture, scaling, and hosting. It is essential to have experience with Python, LLM models, small-sized LLMs, code-based LLM models, pipelines like LangChain/Ollama, embeddings, memory management, tokenization, and frameworks like RAG. Previous experience in cloud hosting, whether AWS, Azure, or GCP, is required. You will also be expected to implement a feedback mechanism to continually improve the model over time.
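The RAG framework mentioned above prepends retrieved context to the LLM prompt; the retrieval step is just "find the stored document most similar to the query". A toy sketch of that step using bag-of-words cosine similarity (real systems use learned embedding vectors, often via frameworks like LangChain; the documents and query here are invented):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Bag-of-words 'embedding' -- a stand-in for a learned embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

documents = [
    "how to reset your password",
    "pricing plans and billing",
    "deploying models to production",
]

query = "password reset help"
# Retrieval step of RAG: pick the document most similar to the query;
# a generator would then answer the query grounded in this context.
best = max(documents, key=lambda d: cosine(embed(query), embed(d)))
print(best)  # → how to reset your password
```

Swapping `embed` for a real embedding model and `documents` for a vector store is what turns this sketch into a production retrieval layer.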

Posted 2 weeks ago

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well-designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks. Your must-have skills for this role include proficiency in using the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, as well as knowledge of OLTP and OLAP system structures and performance tuning. It is essential to have expertise in SQL and schema evolution/versioning best practices. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design. As a Data Modeler, you should possess soft skills such as being detail-oriented, organized, and communicative. You should also feel comfortable presenting schema designs to cross-functional teams. By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.

Posted 2 weeks ago

2.0 - 5.0 years

2 - 5 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using Azure DevOps (Repos, Pipelines, Artifacts).
- Automate infrastructure provisioning using tools like ARM templates, Terraform, or Bicep.
- Collaborate with developers to streamline code deployment, version control, and testing.
- Manage and monitor Azure cloud infrastructure for scalability, performance, and security.
- Implement release strategies, branching models, and environment management.
- Integrate DevOps practices with tools such as GitHub, Docker, Kubernetes, and Azure Monitor.
- Troubleshoot build and deployment issues across environments.

Key Skills Required:
- Azure Pipelines (YAML and Classic)
- Azure Repos (Git), Boards, Artifacts
- Release and deployment automation
- Integration with GitHub, Jira, or Jenkins (optional)
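The "Azure Pipelines (YAML)" skill above can be sketched as a minimal multi-stage azure-pipelines.yml; the stage and job names, the pool image, and the echo placeholders are illustrative assumptions, not from the posting.

```yaml
# Minimal multi-stage Azure Pipelines sketch: build/test, then deploy.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - script: echo "restore, build, run unit tests"
            displayName: Build and test
  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: DeployToStaging
        steps:
          - script: echo "terraform plan / apply for infra, then release"
            displayName: Provision and deploy
```

In practice the `script` steps would be replaced with real build and Terraform/Bicep tasks, and the Deploy stage would target an environment with approval gates.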

Posted 2 weeks ago
