
17329 Scripting Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Job Title: Technical LMS Administrator – SuccessFactors
Location: Gurgaon
Experience: 7-10 Years

Job Description
We are seeking a skilled and detail-oriented Technical LMS Administrator with hands-on experience in SAP SuccessFactors Learning Management System. The ideal candidate will be responsible for the technical administration, configuration, and support of the LMS, as well as managing API integrations, conducting tool testing, and ensuring seamless integration with various learning and HR systems.

Key Responsibilities
- Serve as the primary technical administrator for SAP SuccessFactors LMS, including user management, course deployment, system configuration, and troubleshooting.
- Manage training assignments, schedules, notifications, and assessments to ensure timely delivery of learning programs.
- Troubleshoot technical issues related to the LMS and provide prompt resolution to ensure uninterrupted access for users.
- Develop and maintain APIs and integrations between SuccessFactors and external/internal tools, platforms, and systems.
- Collaborate with IT, HR, and Learning & Development teams to identify integration requirements and deliver scalable solutions.
- Conduct testing, quality assurance, and validation of new tools and updates prior to implementation.
- Monitor system performance, manage technical issues, and coordinate with SAP Support as needed.
- Support data migrations, custom reports, dashboards, and analytics as per business requirements.
- Maintain system documentation, including process flows, configuration guides, and integration architecture.
- Stay updated on SuccessFactors releases, evaluate their impact on the existing setup, and apply updates/configurations accordingly.
- Ensure data security, privacy, and compliance with organizational and legal standards.

Education & Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 7-10 years of experience managing LMS platforms, preferably SuccessFactors.
- Proven experience with API integrations (REST/SOAP), SFTP, and middleware platforms (e.g., SAP CPI, Boomi, MuleSoft).
- Familiarity with SCORM, AICC, xAPI, and other e-learning standards.

Technical Skills
- Strong knowledge of SuccessFactors Learning administration and architecture.
- Proficient in XML, JSON, Postman, and API testing/debugging tools.
- Basic scripting or programming knowledge (e.g., Python, JavaScript) is a plus.
- Experience with Single Sign-On (SSO), user authentication, and data privacy protocols.

Soft Skills
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration abilities.
- Ability to manage multiple projects and priorities effectively.
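
For illustration only: the API-integration duties above typically involve exercising REST endpoints both by hand (Postman) and in code. Below is a minimal Python sketch of such a check; the base URL, token, endpoint path, and field names are hypothetical stand-ins, not the actual SuccessFactors Learning API.

    import requests

    # Hypothetical endpoint and token for illustration only; a real
    # SuccessFactors Learning integration would use the documented
    # OData services and OAuth flow.
    BASE_URL = "https://lms.example.com/api/v1"
    TOKEN = "replace-with-real-token"

    def fetch_course(course_id: str) -> dict:
        """Fetch a single course record and fail loudly on errors."""
        resp = requests.get(
            f"{BASE_URL}/courses/{course_id}",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()  # surface HTTP errors instead of failing silently
        return resp.json()

    if __name__ == "__main__":
        course = fetch_course("SEC-101")
        print(course.get("title"), course.get("status"))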

Posted 7 hours ago

Apply

0.0 - 2.0 years

0 Lacs

Calicut, Kerala

On-site

Source: Indeed

We have an exciting opportunity for a highly motivated QA Automation Engineer with a strong background in automation testing. In this role, you will be responsible for ensuring the quality of various applications by designing and implementing automated test solutions. You will collaborate with developers, business analysts, and project managers to shape and verify code and ensure conformance to system requirements. If you're passionate about automation testing and have 2 years of relevant experience, we'd like to hear from you.

Primary Responsibilities:
- Automation Testing: Develop, maintain, and execute automated test scripts using industry-standard automation tools and frameworks.
- Test Strategy and Design: Assist in the design of test strategies, test cases, and test data, focusing on automation wherever possible.
- Test Script Development: Write and document automated test scripts based on functional profiles and test requirements.
- Test Execution: Execute automated tests, monitor test results, and report defects, ensuring timely delivery of high-quality software.
- Collaboration: Collaborate with cross-functional teams to review test plans and strategies and ensure comprehensive test coverage, including unit, functional, performance, stress, and acceptance testing.
- Defect Management: Assist in managing and maintaining defect-tracking processes, working closely with development teams.
- Continuous Learning: Stay up to date with emerging automation testing tools and technologies, and evaluate their applicability to enhance testing processes.
- Metrics Reporting: Collect and report meaningful test metrics to assess test efficiency and effectiveness.

Key Qualifications:
- 2+ years of hands-on experience in automation testing, with a proven track record of delivering high-quality software applications.
- Proficiency in automation testing tools such as Selenium, Appium, or similar.
- Strong programming skills, preferably in languages like Java, Python, or other scripting languages.
- Familiarity with defect-tracking tools such as JIRA, Mantis, or equivalent.
- Experience with system integration, release management, and automation testing in web and mobile applications.
- Knowledge of API testing using tools like Postman or Swagger.
- Understanding of database testing, including the ability to write SQL queries.
- Good communication skills, both written and verbal, with the ability to interact effectively with team members and clients.
- Experience with test management tools, such as TestRail.
- Knowledge of cloud testing is a plus.

If you are a self-motivated, problem-solving automation engineer with a passion for technology and a strong desire to contribute to the success of our projects, we encourage you to apply. Join our dynamic team and help us ensure the delivery of high-quality software solutions to our clients.

Job Types: Full-time, Permanent
Pay: ₹15,000.00 - ₹20,000.00 per month
Benefits: Paid sick time, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Ability to commute/relocate: Calicut, Kerala: Reliably commute or planning to relocate before starting work (Required)
Experience: total work: 2 years (Preferred)
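
For a flavor of the Selenium scripting this role centers on, here is a minimal Python sketch of an automated UI check. The URL, element IDs, and expected title are placeholders; a real suite would live inside a framework such as pytest with proper waits and fixtures.

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Placeholder URL; a real test targets the application under test.
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")
        # Locate fields by their id attributes and interact with them.
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "submit").click()
        # A simple assertion: the page title should change after login.
        assert "Dashboard" in driver.title, "login did not reach the dashboard"
    finally:
        driver.quit()  # always release the browser, even on failure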

Posted 7 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Ciena is committed to our people-first philosophy. Our teams enjoy a culture focused on prioritizing a personalized and flexible work environment that empowers an individual's passions, growth, wellbeing and belonging. We're a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

How You Will Contribute
As the CISO & Executive Metrics and Reporting Analyst, you will report directly to the Chief Information Security Officer (CISO) and play a pivotal role in shaping and communicating the security posture of the organization. You will be responsible for developing and managing a comprehensive security metrics and reporting framework that supports executive decision-making and regulatory compliance.

Key Responsibilities
- Define, track, and analyze key performance and risk indicators (KPIs/KRIs) aligned with security goals and frameworks (e.g., NIST, ISO 27001).
- Deliver regular and ad-hoc executive-level reports and dashboards that translate complex security data into actionable insights.
- Collect and analyze data from SIEM systems, security tools, and incident reports to support risk management and strategic planning.
- Collaborate with IT, compliance, and business units to align on metrics and reporting requirements.
- Continuously improve reporting processes and stay current with cybersecurity trends and best practices.

The Must Haves
- Education: Bachelor's degree in Computer Science, Information Systems, Cybersecurity, or a related field. A Master's degree is a plus.
- Experience: Minimum 5 years in cybersecurity metrics and reporting, preferably in an executive-facing role.
- Experience with data visualization tools (e.g., Power BI, Tableau, Excel).
- Familiarity with SIEM systems (e.g., Splunk) and cybersecurity frameworks (e.g., NIST, ISO 27001).
- Proficiency in SQL and experience with Snowflake for data warehousing.
- Strong analytical skills with the ability to interpret complex data sets.
- Experience with ETL processes and Python scripting is a plus.
- Excellent written and verbal communication skills, with the ability to present to non-technical stakeholders.

Assets
- Relevant certifications such as CISSP, CISM, or CRISC.
- Experience working in cross-functional teams and influencing stakeholders.
- Strategic thinking and adaptability to evolving security threats and technologies.
- Strong attention to detail and a proactive approach to problem-solving.
- Passion for continuous improvement and innovation in cybersecurity reporting.

Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox.

At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
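
To illustrate the metrics work described, here is a small Python sketch that computes one common KRI, mean time to resolve, from an exported incident list. The file name, column names, and severity grouping are invented for the example.

    import pandas as pd

    # Hypothetical export of closed incidents; column names are illustrative.
    incidents = pd.read_csv("incidents.csv", parse_dates=["opened_at", "resolved_at"])

    # Mean time to resolve (MTTR) in hours, a typical executive-facing KRI.
    incidents["ttr_hours"] = (
        incidents["resolved_at"] - incidents["opened_at"]
    ).dt.total_seconds() / 3600

    mttr_by_severity = incidents.groupby("severity")["ttr_hours"].mean().round(1)
    print(mttr_by_severity)  # e.g., feed this into a Power BI / Tableau dataset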

Posted 7 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Role Overview:
As a Senior Technical Consultant at Hashmato, you will play a critical role in designing, implementing, and supporting technology solutions for clients. You will work closely with customers, internal teams, and partners to deliver high-quality technical services, ensuring optimal performance and alignment with business goals.

Key Responsibilities:
✅ Understand client requirements and design appropriate technical solutions
✅ Implement, configure, and integrate Hashmato products or relevant technologies
✅ Provide technical support and troubleshooting for existing implementations
✅ Collaborate with development, QA, and product teams to deliver solutions
✅ Create technical documentation, user guides, and reports
✅ Assist in pre-sales technical activities, including solution demos and proposals
✅ Conduct workshops, training, and knowledge transfer sessions for clients

Skills & Qualifications:
- 1–3 years of experience in a technical consulting, software engineering, or solution implementation role
- Strong programming/scripting skills (e.g., Python, Java, Golang, or relevant to Hashmato's stack)
- Familiarity with APIs, cloud platforms (AWS, Azure, GCP), and integration technologies
- Good understanding of system architecture, databases (SQL/NoSQL), and security best practices
- Strong problem-solving skills and the ability to work independently and as part of a team
- Excellent communication and client-facing abilities

Preferred:
- Prior experience with Hashmato products, SaaS platforms, or similar domains
- Experience in the POS (Point of Sale) industry is a strong advantage

Posted 7 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Job Title: MIS SAP
Job Location: Pune

Candidate Specification & Job Description
Candidates must have 2+ years of experience as MIS SAP.
- Develop and implement SAP scripts and automation solutions using tools like SAP GUI Scripting, SAP VBA, and SAP BDC
- Analyze business requirements and identify opportunities for automation and process improvement
- Design, develop, and test SAP scripts and automation solutions to meet business needs
- Collaborate with cross-functional teams to identify and prioritize automation projects
- Troubleshoot and resolve issues with SAP scripts and automation solutions
- Develop and maintain technical documentation for SAP scripts and automation solutions
- Provide training and support to end-users on SAP scripts and automation solutions
- Experience with SAP ERP systems (e.g., SAP ECC, SAP S/4HANA)
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Ability to work in a fast-paced environment and meet deadlines
- Must be flexible to work in shifts

Skills Required
Role: Senior Associate - MIS SAP - Pune
Industry Type: ITES/BPO/KPO
Required Education: B.Com
Employment Type: Full Time, Permanent
Key Skills: MIS SAP, Power BI, SAP GUI

Other Information
Job Code: GO/JC/386/2025
Recruiter Name: Marilakshmi S
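
For illustration: SAP GUI Scripting is usually driven from VBScript or VBA, but the same COM interface can also be reached from Python via pywin32. The sketch below assumes a Windows machine with SAP GUI running, scripting enabled, and an already logged-in session; the transaction code is only an example.

    import win32com.client  # pywin32; requires SAP GUI with scripting enabled

    # Attach to an already running, logged-in SAP GUI session.
    sap_gui = win32com.client.GetObject("SAPGUI")
    application = sap_gui.GetScriptingEngine
    session = application.Children(0).Children(0)

    # Enter a transaction code (illustrative) and press Enter.
    session.findById("wnd[0]/tbar[0]/okcd").text = "/nSE16"
    session.findById("wnd[0]").sendVKey(0)  # 0 = Enter key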

Posted 7 hours ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Role name: Automation Test Lead
Years of exp: 5 - 8 yrs

About Dailoqa
Dailoqa's mission is to bridge human expertise and artificial intelligence to solve the challenges facing financial services. Our founding team of 20+ international leaders, including former CIOs and senior industry experts, combines extensive technical expertise with decades of real-world experience to create tailored solutions that harness the power of combined intelligence. With a focus on Financial Services clients, we have deep expertise across Risk & Regulations, Retail & Institutional Banking, Capital Markets, and Wealth & Asset Management. Dailoqa has global reach in the UK, Europe, Africa, India, ASEAN, and Australia. We integrate AI into business strategies to deliver tangible outcomes and set new standards for the financial services industry.

Working at Dailoqa will be hard work; our environment is fluid and fast-moving, and you'll be part of a community that values innovation, collaboration, and relentless curiosity. We're looking for people who:
- Are proactive, curious, adaptable, and patient
- Will shape the company's vision and have a direct impact on its success
- Want the opportunity for fast career growth
- Want the opportunity to participate in the upside of an ultra-growth venture
- Have fun 🙂

Don't apply if:
- You want to work on a single layer of the application.
- You prefer to work on well-defined problems.
- You need clear, pre-defined processes.
- You prefer a relaxed and slow-paced environment.

Role Overview
As an Automation Test Lead at Dailoqa, you'll architect and implement robust testing frameworks for both software and AI/ML systems. You'll bridge the gap between traditional QA and AI-specific validation, ensuring seamless integration of automated testing into CI/CD pipelines while addressing unique challenges like model accuracy, GenAI output validation, and ethical AI compliance.

Key Responsibilities

Test Automation Strategy & Framework Design
- Design and implement scalable test automation frameworks for frontend (UI/UX), backend APIs, and AI/ML model-serving endpoints using tools like Selenium, Playwright, Postman, or custom Python/Java solutions.
- Build GenAI-specific test suites for validating prompt outputs, LLM-based chat interfaces, RAG systems, and vector search accuracy.
- Develop performance testing strategies for AI pipelines (e.g., model inference latency, resource utilization).

Continuous Testing & CI/CD Integration
- Establish and maintain continuous testing pipelines integrated with GitHub Actions, Jenkins, or GitLab CI/CD.
- Implement shift-left testing by embedding automated checks into development workflows (e.g., unit tests, contract testing).

AI/ML Model Validation
- Collaborate with data scientists to test AI/ML models for accuracy, fairness, stability, and bias mitigation using tools like TensorFlow Model Analysis or MLflow.
- Validate model drift and retraining pipelines to ensure consistent performance in production.

Quality Metrics & Reporting
- Define and track KPIs: test coverage (code, data, scenarios), defect leakage rate, automation ROI (time saved vs. maintenance effort), and model accuracy thresholds.
- Report risks and quality trends to stakeholders in sprint reviews.
- Drive adoption of AI-specific testing tools (e.g., LangChain for LLM testing, Great Expectations for data validation).

Technical Requirements

Must-Have
- 5–8 years in test automation, with 2+ years validating AI/ML systems.
- Expertise in automation tools: Selenium, Playwright, Cypress, REST Assured, Locust/JMeter
- CI/CD: Jenkins, GitHub Actions, GitLab
- AI/ML testing: model validation, drift detection, GenAI output evaluation
- Languages: Python, Java, or JavaScript
- Certifications: ISTQB Advanced, CAST, or equivalent
- Experience with MLOps tools: MLflow, Kubeflow, TFX
- Familiarity with vector databases (Pinecone, Milvus) and RAG workflows
- Strong programming/scripting experience in JavaScript, Python, Java, or similar
- Experience with API testing, UI testing, and automated pipelines
- Understanding of AI/ML model testing, output evaluation, and non-deterministic behavior validation
- Experience with testing AI chatbots, LLM responses, prompt engineering outcomes, or AI fairness/bias
- Familiarity with MLOps pipelines and automated validation of model performance in production
- Exposure to Agile/Scrum methodology and tools like Azure Boards

Soft Skills
- Strong problem-solving skills for balancing speed and quality in fast-paced AI development.
- Ability to communicate technical risks to non-technical stakeholders.
- Collaborative mindset to work with cross-functional teams (data scientists, ML engineers, DevOps).
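
As a sketch of GenAI output validation, the pytest-style example below treats an LLM answer as non-deterministic and asserts on properties (grounding, length) rather than exact strings. generate_answer is a hypothetical stand-in for the RAG system under test.

    import re

    def generate_answer(question: str) -> dict:
        """Hypothetical stand-in for a RAG pipeline under test."""
        return {
            "text": "Net interest margin rose to 3.1% [doc-42].",
            "citations": ["doc-42"],
        }

    def test_answer_is_grounded():
        # Non-deterministic outputs are checked for properties, not exact text.
        answer = generate_answer("What happened to net interest margin?")
        assert answer["citations"], "every answer must cite a retrieved document"
        # Each inline citation marker should refer to a returned citation id.
        for marker in re.findall(r"\[(.+?)\]", answer["text"]):
            assert marker in answer["citations"], f"uncited marker: {marker}"

    def test_answer_length_bounded():
        answer = generate_answer("What happened to net interest margin?")
        assert len(answer["text"]) < 2000, "guard against runaway generations"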

Posted 7 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines in the scheduler via Airflow

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience in the AWS/Azure stack (required)
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification is desirable
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql
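
For context on the PySpark pipeline work this role involves, a minimal batch-transformation sketch follows. The input path, column names, and output location are invented for the example; on Databricks the SparkSession is already provided as spark.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Illustrative input path and schema-on-read; a Databricks job might
    # read from a Delta table or cloud storage (s3:// or abfss://) instead.
    orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

    # A typical silver-layer cleanup: typed columns plus a derived date key.
    fact_orders = (
        orders
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("order_date", "customer_id")
        .agg(F.sum("amount").alias("daily_amount"))
    )

    fact_orders.write.mode("overwrite").parquet("/data/warehouse/fact_orders")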

Posted 7 hours ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Title: PySpark Data Engineer
Location: Pune (Hybrid)
Contract: 6 to 11 months

Job Description:
We are looking for a skilled PySpark Data Engineer with 2 to 4 years of experience. The ideal candidate should have strong expertise in building and optimizing data pipelines using PySpark and experience working on cloud platforms like Azure or AWS.

Mandatory skills:
- Strong expertise in PySpark
- Experience with PySpark on Azure or AWS
- Good understanding of Informatica PC
- Good understanding of Data Warehousing concepts and SQL
- Excellent analytical and troubleshooting skills
- Excellent verbal and written communication skills

Good-to-have skills:
- Basic Unix commands and shell scripting
- Version control tools like Git
- DevOps experience

Posted 7 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description:
We are seeking a highly driven and skilled Quality Assurance Engineer to promote testing throughout the development cycle and ensure the delivery of high-quality, customer-focused solutions. The ideal candidate will have strong expertise in testing containerized, cloud-hosted/on-prem back-end systems, including a variety of protocols such as DNS, LDAP, Diameter, REST API, RADIUS, and NTP. The candidate will be responsible for driving automation engineering, collaborating with cross-functional teams, and continuously evolving testing practices. This role also involves providing client team support, assisting in incident investigations, and ensuring adherence to testing best practices and standards. The successful candidate will contribute to the evolution of automation engineering, develop and maintain test automation frameworks, and apply their skills in testing large-scale, network-based applications in an Agile environment.

Key Responsibilities:
- Promote testing at all stages of the development cycle to ensure high-quality outcomes.
- Interpret business requirements and convert them into technical actions for high-quality testing solutions.
- Assist squads with quality assurance activities, including requirements analysis, exploratory testing, and test automation.
- Demonstrate excellent skills in scripting, design, analysis, and reporting.
- Perform detailed results analysis, including application monitoring and third-party application evaluations.
- Provide support for client teams, participate in incident investigations, and support release activities.
- Apply strong scripting skills (e.g., Python) to automate tasks.
- Utilize hands-on experience with network/cmd tools.
- Proactively contribute to the evolution of automation engineering practices.
- Continuously develop new skills and stay updated on emerging technologies.
- Apply a strong analytical mindset and ask the right questions to ensure quality.
- Be creative and action-oriented in solving complex testing challenges.
- Ensure adherence to testing best practices, continual improvement, and relevant standards.

Qualifications:
- Bachelor's or Master's degree in Computer Science or Software Engineering.
- Strong knowledge and experience in testing containerized, cloud-hosted/on-prem back-end systems with various protocols (DNS, LDAP, Diameter, REST API, RADIUS, NTP).
- Experience testing elements running within a Continuous Deployment (CD) pipeline.
- Strong expertise in automating backend API testing.
- Proven experience working within an Agile team.
- Deep knowledge of Linux; OpenStack knowledge is preferred.
- Experience creating and maintaining test automation frameworks and identifying tests for automation using BDD.
- Proven ability to build a test automation framework from scratch.
- Experience testing a wide variety of protocols and services beyond just HTTP.
- Experience testing large-scale applications, preferably network-based.
- Expertise in automating tasks using tools such as Jenkins and scripting languages like Scala, Python, or Shell.
- Familiarity with Wireshark, PCAP, Splunk, Grafana, and ELK.
- Knowledge of BIG-IP, F5, and load balancing.
- Experience testing software in an Agile methodology with CI/CD, using build tools like Jenkins and GitHub.
- Hands-on experience with AWS, cloud, and microservices.
- Strong communication skills, able to share test findings at all levels.
- Experience as a performance engineer with deep technical skills and understanding.
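
To give a feel for protocol-level test automation beyond HTTP, here is a small sketch using the dnspython library to assert on DNS answers. The domain and the expected record count are placeholders for whatever the system under test should serve.

    import dns.resolver  # pip install dnspython

    def test_a_record_resolves():
        # Placeholder zone; a real suite would target the system under test.
        answers = dns.resolver.resolve("example.com", "A")
        assert len(answers) >= 1, "expected at least one A record"
        for record in answers:
            # Each rdata object exposes the resolved IPv4 address as text.
            print("resolved:", record.to_text())

    if __name__ == "__main__":
        test_a_record_resolves()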

Posted 7 hours ago

Apply

4.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

Location: Sultanpur, Delhi
Type: Full-Time
Experience: 2–4 years

About the company:
Here we celebrate craftsmanship and design-led storytelling. Rooted in Indian heritage with a modern global outlook, our brand spans artisanal leather goods and lifestyle products. We are looking for a Videographer to join our in-house creative team—someone who can bring our stories to life through cinematic, engaging motion content that reflects the essence of the company.

What You'll Do:
- Conceptualize, plan, and execute brand videos, product films, behind-the-scenes footage, social media reels, and campaign stories
- Collaborate with the design, marketing, and e-commerce teams to produce high-quality video content for web, retail, social, and editorial use
- Own the entire video production process—from scripting and storyboarding to shooting, lighting, editing, and sound design
- Shoot both studio and on-location content, ensuring a consistent, premium visual style aligned with our brand aesthetics
- Edit videos with a strong sense of pacing, rhythm, and visual storytelling across formats (short-form, vertical, landscape)
- Manage and maintain video equipment, file organization, and content archives
- Stay current on videography trends, tools, and techniques within the fashion and lifestyle industry

What We're Looking For:
- 2–4 years of professional videography experience in fashion, product, or lifestyle content
- A showreel/portfolio that demonstrates strong storytelling, aesthetic finesse, and editing skills
- Proficiency in Adobe Premiere Pro, After Effects, and other relevant post-production tools
- Solid grasp of studio and natural lighting techniques, composition, camera operation, and audio basics
- Ability to collaborate cross-functionally, take direction, and execute under tight deadlines
- Highly organized, self-driven, and capable of managing multiple projects at once

Bonus Points If You:
- Have experience filming luxury or heritage fashion brands
- Have a thorough understanding of social media trends and audio
- Can contribute to creative ideation, scripting, or art direction
- Have a flair for set styling and understand visual branding nuances

Posted 7 hours ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines in the scheduler via Airflow

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience in the AWS/Azure stack (required)
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification is desirable
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql

Posted 7 hours ago

Apply

1.0 - 3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Job Title: Zoho Developer
Location: Ahmedabad, Gujarat, India
Company: Xcellhost Cloud Technologies

About Xcellhost Cloud Technologies:
Xcellhost Cloud Technologies is a leading provider of cloud solutions, IT infrastructure, and enterprise technology services. We specialize in delivering innovative cloud services, managed IT solutions, and IT consulting, helping businesses transform digitally. With a strong presence across India and globally, we are committed to providing cutting-edge solutions that empower businesses to scale and succeed.

Position Overview:
Xcellhost is looking for a Zoho Developer to join our dynamic team based in Ahmedabad. As a Zoho Developer, you will be responsible for the customization, development, and maintenance of Zoho applications (Zoho CRM, Zoho Creator, Zoho Books, etc.). The ideal candidate will have a deep understanding of Zoho's platform, a problem-solving mindset, and a passion for building high-quality software solutions that meet business requirements.

Key Responsibilities:
- Zoho Platform Customization & Development: Design, customize, and implement Zoho applications, including Zoho CRM, Zoho Creator, Zoho Books, Zoho Projects, etc., to meet specific business needs.
- Integration Development: Integrate Zoho apps with third-party systems (ERP, payment gateways, marketing tools, etc.) using APIs, webhooks, and other integration tools.
- Automation & Workflow Design: Develop and implement automation features within Zoho to streamline business processes, improve productivity, and enhance user experience.
- Troubleshooting & Support: Provide technical support, debug issues, and optimize existing Zoho applications to ensure smooth functionality.
- Documentation & Testing: Write detailed technical documentation and conduct regular testing of developed applications to ensure their robustness and reliability.
- Collaboration & Consulting: Work closely with business teams to understand requirements and provide Zoho-based solutions tailored to those needs. Act as a consultant to clients on best practices and optimal use of Zoho tools.
- Stay Updated: Continuously improve skills and stay up to date with the latest updates, features, and capabilities in the Zoho ecosystem.

Required Skills & Qualifications:
- Proven experience in Zoho CRM customization and Zoho Creator application development.
- Solid knowledge of Zoho's API, webhooks, and integrations with third-party applications.
- Strong proficiency in Deluge scripting, Zoho's built-in scripting language.
- Experience in building and automating workflows, reports, and dashboards within the Zoho environment.
- Familiarity with Zoho Books, Zoho Projects, and other Zoho applications is a plus.
- Understanding of web technologies like HTML, CSS, JavaScript, and REST APIs.
- Ability to troubleshoot and debug issues, ensuring quality, scalability, and performance.
- Excellent communication and collaboration skills, with the ability to work in a team and interact with clients.

Preferred Skills:
- Experience with Zoho Analytics and advanced data reporting.
- Understanding of cloud-based solutions and other SaaS tools.
- Prior experience providing technical consulting on Zoho solutions.

Education & Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 1-3 years of experience working with the Zoho platform.

Why Xcellhost?
- Work with a leading technology provider specializing in cloud-based IT solutions.
- Opportunity to be part of a fast-growing company with a collaborative work culture.
- Competitive salary and benefits package.
- Career growth and skill development opportunities in the cloud computing industry.
- Flexible work environment and the chance to work on cutting-edge technology.

How to Apply:
Interested candidates can send their updated resumes to jayp@xcellhost.cloud. Please mention "Zoho Developer" in the subject line.
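
For illustration, Zoho integrations are often exercised over Zoho's REST API from outside the platform. The sketch below follows the general shape of Zoho CRM's v2 REST conventions, but the endpoint, token handling, and field names should be treated as assumptions to verify against the current Zoho API documentation.

    import requests

    # Illustrative call following Zoho CRM's v2 REST conventions; the token
    # would come from Zoho's OAuth flow, and field names vary by org setup.
    ACCESS_TOKEN = "replace-with-oauth-token"

    resp = requests.get(
        "https://www.zohoapis.com/crm/v2/Leads",
        headers={"Authorization": f"Zoho-oauthtoken {ACCESS_TOKEN}"},
        params={"per_page": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for lead in resp.json().get("data", []):
        print(lead.get("Last_Name"), lead.get("Company"))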

Posted 7 hours ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Job Title: Senior Data Engineer (AWS Expert)
Location: Ahmedabad
Experience: 5+ Years
Company: IGNEK
Shift Time: 2 PM - 11 PM IST

About IGNEK:
IGNEK is a fast-growing custom software development company with over a decade of industry experience and a passionate team of 25+ experts. We specialize in crafting end-to-end digital solutions that empower businesses to scale efficiently and stay ahead in an ever-evolving digital world. At IGNEK, we believe in quality, innovation, and a people-first approach to solving real-world challenges through technology.

We are looking for a highly skilled and experienced Data Engineer with deep expertise in AWS cloud technologies and strong hands-on experience in backend development, data pipelines, and system design. The ideal candidate will take ownership of delivering robust and scalable solutions while collaborating closely with cross-functional teams and the tech lead.

Key Responsibilities:
- Lead and manage the end-to-end implementation of cloud-native data solutions on AWS.
- Design, build, and maintain scalable data pipelines (PySpark/Spark) and data lake architectures (Delta Lake 3.0 or similar).
- Migrate on-premises systems to modern, scalable AWS-based services.
- Engineer robust relational databases using Postgres or Oracle with a strong understanding of procedural languages.
- Collaborate with the tech lead to understand business requirements and deliver practical, scalable solutions.
- Integrate newly developed features following defined SDLC standards using CI/CD pipelines.
- Develop orchestration and automation workflows using tools like Apache Airflow.
- Ensure all solutions comply with security best practices, performance benchmarks, and cloud architecture standards.
- Monitor, debug, and troubleshoot issues across multiple environments.
- Stay current with new AWS features, services, and trends to drive continuous platform improvement.

Required Skills and Experience:
- 5+ years of professional experience in data engineering and backend development.
- Strong expertise in Python, Scala, and PySpark.
- Deep knowledge of AWS services: EC2, S3, Lambda, RDS, Kinesis, IAM, API Gateway, and others.
- Hands-on experience with Postgres or Oracle, and building relational data stores.
- Experience with Spark clusters, Delta Lake, Glue Catalog, and large-scale data processing.
- Proven track record of end-to-end project delivery and third-party system integrations.
- Solid understanding of microservices, serverless architectures, and distributed computing.
- Skilled in Java, Bash scripting, and search tools like Elasticsearch.
- Proficient in using CI/CD tools (e.g., GitLab, GitHub, AWS CodePipeline).
- Experience working with Infrastructure as Code (IaC) using Terraform.
- Hands-on experience with Docker, containerization, and cloud-native deployments.

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Solutions Architect or similar).
- Exposure to Agile/Scrum project methodologies.
- Familiarity with Kubernetes, advanced networking, and cloud security practices.
- Experience managing or collaborating with onshore/offshore teams.

Soft Skills:
- Excellent communication and stakeholder management.
- Strong leadership and problem-solving abilities.
- Team player with a collaborative mindset.
- High ownership and accountability in delivering quality outcomes.

Why Join IGNEK?
- Work on exciting, large-scale digital transformation projects.
- Be part of a people-centric, innovation-driven culture.
- A flexible work environment and opportunities for continuous learning.

How to Apply:
Please send your resume and a cover letter detailing your experience to hr@ignek.com
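
As a taste of the Airflow orchestration mentioned above, a minimal DAG sketch follows. The DAG name, schedule, and task logic are invented placeholders, and the schedule argument assumes Airflow 2.4 or newer.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        # Placeholder for real pipeline logic (e.g., S3 -> transform -> warehouse).
        print("running nightly load")

    with DAG(
        dag_id="nightly_orders_load",  # invented name for illustration
        start_date=datetime(2025, 1, 1),
        schedule="@daily",             # Airflow 2.4+ spelling of the schedule
        catchup=False,
    ) as dag:
        load = PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )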

Posted 7 hours ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Our technology services client is seeking multiple QA Managers to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Further details about the role are below:

Role: QA Manager
Mandatory Skills: Automation, BDD, Cucumber, Selenium, WebDriver, Rest Assured, Shell Scripting
Experience: 12+ Years
Location: Bangalore
Notice Period: 15 Days or Less

Job Description:
- Testing certification (e.g., ISTQB)
- Knowledge of programming/scripting: Java, Shell
- Automation frameworks (Java-based): Cucumber BDD, Selenium WebDriver, Rest-Assured, Serenity BDD
- Performance testing tools: JMeter
- Data visualization and monitoring tool: Grafana
- Test management & defect tracking tools like JIRA, HP ALM
- Understanding of SDLC & STLC, including Agile Scrum
- CI/CD tools: Jenkins, GitLab CI
- Database: SQL

Analytical & Problem-Solving Skills
- Strong analytical thinking to identify root causes of issues
- Ability to interpret complex requirements and translate them into test cases
- Risk analysis and prioritization of testing efforts

Leadership & Management Skills
- Team management and mentoring
- Resource planning and task delegation
- Conflict resolution and motivation
- Performance evaluation and feedback

Communication & Collaboration Skills
- Clear verbal and written communication
- Ability to collaborate with cross-functional teams
- Stakeholder management and reporting
- Client interaction and expectation management

Process-Oriented Skills
- Familiarity with Agile, Scrum, or DevOps methodologies
- Process improvement and QA best practices
- Documentation and compliance with standards (e.g., ISO, CMMI)

If you are interested, share your updated resume with rajesh.s@s3staff.com

Posted 7 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

🎯 We're Hiring | Join Our Engineering Dream Team – India 🚀
Looking to shape the future of automotive, industrial, and semiconductor innovation? We're growing and hiring across multiple technical domains! Explore high-impact roles with global collaboration opportunities. 🌍

🔋 AMS Design Lead | Hyderabad
• Lead next-gen automotive-grade PMIC design
• Own Switcher IPs (DC-DC converters) for global programs
• 9+ yrs in analog design – references, amplifiers, loop compensation
• Mentor teams & collaborate across international centers
• Drive power-efficient & precision-focused innovations

✅ AMS Verification Lead | Hyderabad
• 9+ yrs in Verilog-AMS, WREAL, UVM, AMS simulation flows
• Build verification environments from scratch
• Own sign-off strategies & mentor verification engineers
• Expertise in co-simulation & mixed-signal modeling

⚙️ Embedded Software Applications Engineer | Pune
• 5+ yrs hands-on embedded SW experience
• Motor control expertise – FOC, sensorless, C/C++, Cortex-M
• Experience with the full SW lifecycle (ASPICE L2), debugging, protocols (SPI, I2C, UART)
• Work with tools like IAR, GitLab, oscilloscopes
• Collaborate with global teams & travel opportunities

🛠️ Embedded Software Engineer – V&V | Hyderabad
• 5+ yrs in embedded SW V&V – VectorCAST, ASPICE/V-Model
• C programming, MCU-based systems (ARM/STM/PIC), UART, CAN, SPI, I2C
• Firmware integration, board bring-up & debugging
• Familiar with Git, Keil, IAR
• Bonus: C++, shell scripting, hardware interfaces

💾 Senior Physical Design Expert & Lead | Hyderabad
• Hands-on Netlist2GDSII flow on advanced nodes (16nm & below)
• Floor planning, power grid, CTS, STA, and physical verification
• Tools: Cadence Innovus, Synopsys ICC2
• Strong in SoC integration & Tcl/Tk/Perl scripting
• Proven leadership in physical design projects

🧪 Senior PSV Engineer & Lead | Hyderabad/Noida
• Post-silicon validation of analog mixed-signal IPs/SoCs
• Strong analog/digital fundamentals
• Experience with tools: oscilloscope, NI-PXI, spectrum analyzer
• Python & LabVIEW scripting for automation
• Exposure to current sensor validation is a plus

Interested? Apply or know someone great? Reach out via DM or WhatsApp +91 9966034636, or send your profile to ranjith.allam@cyient.com

Posted 7 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: MLOps Engineer
Urgent — High Priority requirement.
1. Location: Hyderabad/Pune
2. Interview Rounds: 4 rounds
3. Contract: 12 Months

About Client:
We are a fast-growing boutique data engineering firm that empowers enterprises to manage and harness their data landscape efficiently, leveraging advanced machine learning (ML) methodologies.

Job Overview:
We are seeking a highly skilled and motivated MLOps Engineer with 3–5 years of experience to join our engineering team. The ideal candidate should possess a strong foundation in DevOps or software engineering principles with practical exposure to machine learning operational workflows. You will be instrumental in operationalizing ML systems, optimizing the deployment lifecycle, and strengthening the integration between data science and engineering teams.

Required Skills:
● Hands-on experience with MLOps platforms such as MLflow and Kubeflow.
● Proficiency in Infrastructure as Code (IaC) tools like Terraform or Ansible.
● Strong familiarity with monitoring and alerting frameworks (Prometheus, Grafana, Datadog, AWS CloudWatch).
● Solid understanding of microservices architecture, service discovery, and load balancing.
● Excellent programming skills in Python, with experience writing modular, testable, and maintainable code.
● Proficient in Docker and container-based application deployments.
● Experience with CI/CD tools such as Jenkins or GitLab CI.
● Basic working knowledge of Kubernetes for container orchestration.
● Practical experience with cloud-based ML platforms such as AWS SageMaker, Databricks, or Google Vertex AI.
● Competency in Linux shell scripting and command-line operations.
● Proficiency with Git and version control best practices.
● Foundational knowledge of machine learning principles and typical ML workflow patterns.

Good-to-Have Skills:
● Awareness of security practices specific to ML pipelines, including secure model endpoints and data protection.
● Experience with scripting languages like Bash or PowerShell for automation tasks.
● Exposure to database scripting and data integration pipelines.

Experience & Qualifications:
● 3–5+ years of experience in MLOps, Site Reliability Engineering (SRE), or Software Engineering roles.
● At least 2+ years of hands-on experience working on ML/AI systems in production settings.
● Deep understanding of cloud-native architectures, containerization, and the end-to-end ML lifecycle.
● Bachelor's degree in Computer Science, Software Engineering, or a related technical field.
● Relevant certifications such as AWS Certified DevOps Engineer – Professional are a strong plus.
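
To illustrate the MLflow piece of this stack, here is a minimal experiment-tracking sketch. The experiment name, parameters, and metric values are dummies; a team setup would point the client at a shared tracking server.

    import mlflow

    # By default this logs to ./mlruns; a team setup would configure a
    # shared tracking server via mlflow.set_tracking_uri(...).
    mlflow.set_experiment("demo-churn-model")

    with mlflow.start_run(run_name="baseline"):
        # Dummy values standing in for a real training loop.
        mlflow.log_param("max_depth", 6)
        mlflow.log_param("n_estimators", 200)
        mlflow.log_metric("auc", 0.87)
        mlflow.log_metric("auc", 0.89, step=1)  # metrics can be logged per step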

Posted 7 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Us
Location: Hyderabad, India
Department: Product R&D
Level: Professional
Working Pattern: Work from office
Benefits: Benefits At Ideagen
DEI: DEI strategy
Salary: This will be discussed at the next stage of the process; if you have any questions, please feel free to reach out!

We are seeking an experienced Data Engineer with strong problem-solving and analytical skills, high attention to detail, a passion for analytics, real-time data, and monitoring, and critical thinking and collaboration skills. The candidate should be a self-starter and a quick learner, ready to learn new technologies and tools that the job demands.

Responsibilities
- Build automated pipelines and solutions for data migration/data import or other operations requiring data ETL.
- Perform analysis on core products to support migration planning and development.
- Work closely with the Team Lead and collaborate with other stakeholders to gather requirements and build well-architected data solutions.
- Produce supporting documentation, such as specifications, data models, and relations between data, required for the effective development, usage, and communication of the data operations solutions with different stakeholders.

Competencies, Characteristics And Traits

Mandatory Skills
- Minimum 3 years of experience with SnapLogic pipeline development and a minimum of 2 years building ETL/ELT pipelines.
- Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Experience working with API sources and destinations.

Essential Skills and Experience
- Strong experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Strong knowledge of databases, data modeling, and the data life cycle
- Proficient in understanding data and writing complex SQL
- Experience working with REST APIs in data pipelines
- Strong problem solving and high attention to detail
- Passion for analytics, real-time data, and monitoring
- Critical thinking, good communication, and collaboration skills
- Focus on high performance and quality delivery
- Highly self-motivated and a continuous learner

Desirable
- Experience working with NoSQL databases like MongoDB
- Experience with SnapLogic administration is preferable
- Experience working with Microsoft Power Platform (Power Automate and Power Apps) or any similar automation/RPA tool
- Experience with cloud data platforms like Snowflake, Databricks, AWS, Azure, etc.
- Awareness of emerging ETL and cloud concepts such as Amazon AWS or Microsoft Azure
- Experience working with scripting languages such as Python, R, JavaScript, etc.

About Ideagen
Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
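
SnapLogic pipelines are assembled in its visual designer rather than written as code, but the REST-to-database ETL pattern the role describes can be sketched in plain Python. The API URL and table layout below are invented, and SQLite stands in for a real destination database.

    import json
    import sqlite3
    import urllib.request

    # Invented source endpoint; a SnapLogic pipeline would use a REST Snap here.
    URL = "https://api.example.com/v1/customers"

    with urllib.request.urlopen(URL, timeout=30) as resp:
        rows = json.load(resp)  # expect a list of {"id": ..., "name": ...}

    # SQLite stands in for the destination database (MSSQL, PostgreSQL, ...).
    con = sqlite3.connect("staging.db")
    con.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
    con.executemany(
        "INSERT OR REPLACE INTO customers (id, name) VALUES (:id, :name)",
        rows,
    )
    con.commit()
    con.close()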

Posted 7 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Role: Senior Network Engineer - Tier 4
Work mode: Hybrid

Job Summary:
We are seeking a highly skilled and experienced Senior Network Engineer (Tier 4) with deep expertise in Juniper routing and switching, Fortinet firewall configuration and management, and enterprise network architecture. This role is critical in designing, implementing, and supporting complex network infrastructures for large-scale enterprise environments.

Key Responsibilities:
- Lead the design, deployment, and optimization of enterprise network solutions using Juniper and Fortinet technologies.
- Serve as the highest-level escalation point for complex network issues (Tier 4 support).
- Architect and implement secure, scalable, and resilient network infrastructures.
- Configure and manage Fortinet firewalls (FortiGate, FortiManager, FortiAnalyzer).
- Design and maintain Juniper-based routing and switching environments (MX, EX, QFX series).
- Collaborate with cross-functional teams to align network strategies with business goals.
- Conduct network assessments, performance tuning, and capacity planning.
- Develop and maintain detailed network documentation, diagrams, and SOPs.
- Mentor junior engineers and provide technical leadership across projects.
- Stay current with emerging technologies and recommend improvements.

Required Qualifications:

Certifications:
- JNCIA-Junos (Juniper Networks Certified Associate)
- NSE 4 (Fortinet Network Security Expert Level 4)

Technical Expertise:
- Advanced knowledge of Juniper routing and switching (OSPF, BGP, MPLS, VXLAN, EVPN).
- Expert-level experience with Fortinet firewall configuration, policies, VPNs, and UTM features.
- Strong understanding of enterprise network design principles and best practices.
- Proficiency in network monitoring, troubleshooting, and performance analysis tools.
- Familiarity with automation and scripting (Python, Ansible) is a plus.

Experience:
- 8+ years of hands-on experience in network engineering roles.
- Proven track record in designing and supporting large-scale enterprise networks.
- Experience in high-availability and disaster recovery network planning.

Preferred Skills:
- Additional Juniper certifications (e.g., JNCIS, JNCIP, JNCIE).
- Experience with SD-WAN, cloud networking (AWS, Azure), and NAC solutions.
- Knowledge of ITIL processes and change management.
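
As a small example of the Python network automation noted above as a plus, here is a sketch using the netmiko library to run a read-only command on a Junos device. The host and credentials are placeholders; Juniper's own PyEZ library would be an equally common choice.

    from netmiko import ConnectHandler  # pip install netmiko

    # Placeholder device details for illustration only.
    device = {
        "device_type": "juniper_junos",
        "host": "192.0.2.10",
        "username": "netops",
        "password": "replace-me",
    }

    with ConnectHandler(**device) as conn:
        # Read-only show commands are a safe way to start with automation.
        output = conn.send_command("show version")
        print(output)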

Posted 7 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Role: Automation Tester
Required Technical Skill Set: C# testing, unit testing (Selenium), Azure Cloud and DevOps
Desired Experience Range: 6 to 9 yrs
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral Competency)

Must-Have (ideally not more than 4-6):
- C# testing, unit testing (Selenium), Azure Cloud and DevOps

Good-to-Have:
- Azure DevOps for CI/CD pipelines, managing builds, and automated deployments.

Responsibility of / Expectations from the Role
- Design and develop robust automation frameworks and tests using C#, preferably with the MSTest Framework or NUnit.
- Create and maintain Bicep templates for deploying and managing Azure resources.
- Work with Windows and Linux environments to build, deploy, and troubleshoot automation solutions.
- Debug, troubleshoot, and resolve issues in automation, test cases, and infrastructure.
- 4-8 years of experience in software development and automation testing, with a primary focus on C#.
- Proficiency in MSTest Framework (preferred) or NUnit for creating and managing backend test cases.
- Strong experience with Azure cloud services and resource management.
- Hands-on experience with Bicep for Azure resource deployment and management.
- Knowledge of scripting in Bash or PowerShell to create automation solutions.

Posted 7 hours ago

Apply

6.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: Azure Data Engineer (6 Years Experience)
Location: Remote
Employment Type: Full-time
Experience Required: 6 years

Job Summary:
We are seeking an experienced Azure Data Engineer to join our data team. The ideal candidate will have strong expertise in designing and implementing scalable data solutions on Microsoft Azure, with a solid foundation in data integration, data warehousing, and ETL processes.

Key Responsibilities:
- Design, build, and manage scalable data pipelines and data integration solutions in Azure
- Develop and optimize data lake and data warehouse solutions using Azure Data Lake, Azure Synapse, and Azure SQL
- Create ETL/ELT processes using Azure Data Factory
- Implement data modeling and data architecture best practices
- Collaborate with data analysts, data scientists, and business stakeholders to deliver reliable and efficient data solutions
- Monitor and troubleshoot data pipelines for performance and reliability
- Ensure data security, privacy, and compliance with organizational standards
- Automate data workflows and optimize performance using cloud-native tools

Required Skills:
- 6 years of experience in data engineering roles
- Strong experience with Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure SQL
- Proficiency in SQL, T-SQL, and scripting languages (Python or PowerShell)
- Hands-on experience with data ingestion, transformation, and orchestration
- Good understanding of data modeling (dimensional, star/snowflake schema)
- Experience with version control (Git) and CI/CD in data projects
- Familiarity with Azure DevOps and monitoring tools

Preferred Skills:
- Experience with Databricks or Spark on Azure
- Knowledge of data governance and metadata management tools
- Understanding of big data technologies and architecture
- Microsoft Certified: Azure Data Engineer Associate (preferred)

Posted 7 hours ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Job Description:
We are looking for a DevOps Engineer to join our team and help automate, manage, and streamline our development and deployment processes. The ideal candidate will have experience with cloud platforms, CI/CD pipelines, and infrastructure as code (IaC).

Key Responsibilities:
- Design, build, and maintain efficient and reliable CI/CD pipelines.
- Automate infrastructure provisioning using tools like Terraform or CloudFormation.
- Monitor system performance and troubleshoot issues in development and production environments.
- Collaborate with development, QA, and operations teams to ensure smooth releases.
- Implement security best practices across cloud and on-premise environments.
- Manage containerized applications using Docker and orchestration tools like Kubernetes.

Required Skills:
- Experience with cloud platforms (AWS, Azure, or GCP).
- Hands-on experience with CI/CD tools such as Jenkins, GitLab CI, CircleCI, or Azure DevOps.
- Proficiency in containerization tools like Docker and orchestration using Kubernetes.
- Experience with infrastructure-as-code tools (Terraform, Ansible, Chef, or Puppet).
- Familiarity with monitoring tools (Prometheus, Grafana, ELK Stack, etc.).
- Strong knowledge of Linux systems, scripting (Bash, Python), and Git.

Preferred Qualifications:
- Experience with microservices architecture.
- Familiarity with Agile/Scrum methodologies.
- Knowledge of networking and security fundamentals.
- Relevant certifications (AWS Certified DevOps Engineer, CKA, etc.) are a plus.

Posted 7 hours ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Role: Senior Databricks Engineer / Databricks Technical Lead/ Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines in the scheduler via Airflow

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience in the AWS/Azure stack (required)
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification is desirable
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql

Posted 7 hours ago

Apply

4.0 - 7.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site


Area(s) of responsibility
4-7 years' experience in PTC Windchill and ThingWorx customization and configuration.
Experienced in: solution design, Windchill customization debugging, Windchill development fundamentals, documentation, software testing, software maintenance, and software performance tuning. Strong product development methodology and tools experience, including agile methods, source management, problem resolution, automated testing, DevOps, CI/CD, GitHub, SVN, etc.
Technical competences (required):
Windchill application skills in basic and advanced Java, web services, JavaScript, shell scripting, SQL, HTML, and CSS.
Knowledge of Windchill implementation in the basic modules is a must.
Highly skilled in PTC Windchill PDMLink customization, XML, and database (SQL) programming.
In-depth knowledge of and good experience in Java, J2EE, JSP, and JavaScript.
Good understanding of basic PLM processes such as BOM management, part management, document management, EBOM, and MBOM.
Basic knowledge of UML and Unix administration.
Strong business focus, dedicated to meeting the expectations and requirements of the business.
Ability to translate and balance functional and non-functional business requirements into solutions, i.e., work with customers to translate high-level business requirements into detailed functional specifications, and manage changes to the specifications to support impacted business functions and systems.
Good communication and presentation skills are required.
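Windchill's Java customization APIs are proprietary, so rather than inventing PTC calls, here is a generic Python sketch of the kind of REST web-service integration the listing's "web services" skill implies. The server, credentials, endpoint, and JSON shape are all hypothetical; a real Windchill integration would use PTC's documented REST/OData endpoints.

```python
"""Generic sketch of calling a PLM REST web service. Everything here
(host, auth, endpoint, response fields) is hypothetical."""
import base64
import json
import urllib.request

BASE = "https://plm.example.com/api"                 # hypothetical server
AUTH = base64.b64encode(b"user:password").decode()   # basic auth, sketch only


def get_part(number: str) -> dict:
    req = urllib.request.Request(
        f"{BASE}/parts/{number}",
        headers={"Authorization": f"Basic {AUTH}", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=15) as resp:
        return json.load(resp)


if __name__ == "__main__":
    part = get_part("A-12345")
    print(part.get("name"), part.get("state"))
```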

Posted 7 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site


Company overview: TraceLink's software solutions and Opus Platform help the pharmaceutical industry digitize their supply chain and enable greater compliance, visibility, and decision making. It reduces disruption to the supply of medicines to patients who need them, anywhere in the world. Founded in 2009 with the simple mission of protecting patients, today TraceLink has 8 offices, over 800 employees, and more than 1300 customers in over 60 countries around the world. Our expanding product suite continues to protect patients and now also enhances multi-enterprise collaboration through innovative new applications such as MINT. TraceLink is recognized as an industry leader by Gartner and IDC, and for having a great company culture by Comparably.
SE-II
Responsibilities:
Collaborate with cross-functional teams to design, develop, and implement software solutions.
Write clean, maintainable, and efficient code using Java, RxJava, and JavaScript.
Deploy and manage applications on Kubernetes and cloud platforms (AWS).
Participate in code reviews to ensure code quality and adherence to coding standards.
Troubleshoot and debug software applications to resolve issues promptly.
Collaborate with product owners to understand requirements and deliver solutions that meet business needs.
Stay up-to-date with emerging technologies and industry trends to ensure our software remains cutting-edge.
Design, implement, and maintain high-availability solutions.
Contribute to the continual improvement of our architecture.
Analyze and resolve customer-reported problems escalated to engineering for detailed analysis.
Create patches to resolve customer-reported problems.
Work with the QA team to fix and verify defects as part of TraceLink's patch and roadmap releases.
Adhere to TraceLink's documented software development life cycle methodology.
Work with customer support.
Qualifications and Skillsets:
Candidates must possess the following skills and traits:
Bachelor's degree in Computer Engineering or equivalent.
3-5 years of professional experience as a Software Engineer.
Strong proficiency in Java, RxJava, and JavaScript.
Experience with container orchestration using Kubernetes.
Ability to analyze, develop, and implement RESTful services and APIs.
Familiarity with cloud platforms, particularly AWS.
Proficiency with frontend technologies, including basic JavaScript.
Familiarity with DevOps practices and CI/CD pipelines.
Understanding of software development principles and best practices.
Experience working in an Agile/Scrum-inspired delivery methodology.
Solid troubleshooting and debugging skills.
Strong communication and collaboration skills.
Ability to work in a fast-paced and dynamic environment.
Ability to work effectively in a team environment and independently when required.
Work closely with our QA team; assist with test planning as appropriate.
Work closely with product managers and stakeholders to understand requirements and translate them into technical specifications.
Stay updated with industry trends and best practices in software development.
Helpful skills and experience:
Familiarity with the concepts involved in running cloud-based applications on platforms such as Amazon Web Services.
Experience with the pharmaceutical industry.
Familiarity with tools like Jenkins, GitLab CI/CD, Docker, or Kubernetes.
Proficiency in Git and related branching strategies.
Strong problem-solving abilities, excellent communication skills, and a collaborative mindset.
Experience with scripting languages like JavaScript, Python, or Bash for automation tasks.
Understanding of GraphQL.
Experience with microservices architecture.
Understanding of software development best practices and design patterns.
Please see the TraceLink Privacy Policy for more information on how TraceLink processes your personal information during the recruitment process and, if applicable based on your location, how you can exercise your privacy rights. If you have questions about this privacy notice or need to contact us in connection with your personal data, including any requests to exercise your legal rights referred to at the end of this notice, please contact Candidate-Privacy@tracelink.com.
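To illustrate the Kubernetes side of the role, here is a minimal sketch using the official Kubernetes Python client to report deployment readiness. It assumes a local kubeconfig and an installed `kubernetes` package; the namespace is illustrative.

```python
"""Minimal sketch: report whether deployments in a namespace have all
replicas ready. Assumes `pip install kubernetes` and a valid kubeconfig."""
from kubernetes import client, config


def report(namespace: str = "default") -> None:
    config.load_kube_config()  # reads ~/.kube/config
    apps = client.AppsV1Api()
    for dep in apps.list_namespaced_deployment(namespace).items:
        want = dep.spec.replicas or 0
        have = dep.status.ready_replicas or 0
        flag = "OK " if have >= want else "WARN"
        print(f"[{flag}] {dep.metadata.name}: {have}/{want} replicas ready")


if __name__ == "__main__":
    report()
```

Inside a cluster, `config.load_incluster_config()` would replace the kubeconfig call; the rest of the sketch is unchanged.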

Posted 7 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


This is an incredible opportunity to be part of a company that has been at the forefront of AI and high-performance data storage innovation for over two decades. DataDirect Networks (DDN) is a global market leader renowned for powering many of the world's most demanding AI data centers, in industries ranging from life sciences and healthcare to financial services, autonomous cars, government, academia, research, and manufacturing.
"DDN's A3I solutions are transforming the landscape of AI infrastructure." – IDC
"The real differentiator is DDN. I never hesitate to recommend DDN. DDN is the de facto name for AI storage in high-performance environments." – Marc Hamilton, VP, Solutions Architecture & Engineering, NVIDIA
DDN is the global leader in AI and multi-cloud data management at scale. Our cutting-edge data intelligence platform is designed to accelerate AI workloads, enabling organizations to extract maximum value from their data. With a proven track record of performance, reliability, and scalability, DDN empowers businesses to tackle the most challenging AI and data-intensive workloads with confidence. Our success is driven by our unwavering commitment to innovation, customer-centricity, and a team of passionate professionals who bring their expertise and dedication to every project. This is a chance to make a significant impact at a company that is shaping the future of AI and data management. Our commitment to innovation, customer success, and market leadership makes this an exciting and rewarding role for a driven professional looking to make a lasting impact in the world of AI and data storage.
About the Role
You will lead the design and implementation of scalable, secure, and highly available infrastructure across both cloud and on-premise environments. This role demands a deep understanding of Linux systems, infrastructure automation, and performance tuning, especially in high-performance computing (HPC) setups. As a technical leader, you'll collaborate closely with development, QA, and operations teams to drive DevOps best practices, tool adoption, and overall infrastructure reliability.
Key Responsibilities:
• Design, build, and maintain Linux-based infrastructure across cloud (primarily AWS) and physical data centers.
• Implement and manage Infrastructure as Code (IaC) using tools such as CloudFormation, Terraform, Ansible, and Chef.
• Develop and manage CI/CD pipelines using Jenkins, Git, and Gerrit to support continuous delivery.
• Automate provisioning, configuration, and software deployments with Bash, Python, Ansible, etc.
• Set up and manage monitoring/logging systems like Prometheus, Grafana, and the ELK stack.
• Optimize system performance and troubleshoot critical infrastructure issues related to networking, filesystems, and services.
• Configure and maintain storage and filesystems including ext4, xfs, LVM, NFS, iSCSI, and potentially Lustre.
• Manage PXE boot infrastructure using Cobbler/Kickstart, and create/maintain custom ISO images.
• Implement infrastructure security best practices, including IAM, encryption, and firewall policies.
• Act as a DevOps thought leader, mentor junior engineers, and recommend tooling and process improvements.
• Maintain clear and concise documentation of systems, processes, and best practices.
• Collaborate with cross-functional teams to ensure reliable and scalable application delivery.
Required Skills & Experience
• 5+ years of experience in DevOps, SRE, or infrastructure engineering.
• Deep expertise in Linux system administration, especially around storage, networking, and process control.
• Strong proficiency in scripting (e.g., Bash, Python) and configuration management tools (Chef, Ansible).
• Proven experience in managing on-premise data center infrastructure, including provisioning and PXE boot tools.
• Familiarity with CI/CD systems, Agile workflows, and Git-based source control (Gerrit/GitHub).
• Experience with cloud services, preferably AWS, and hybrid cloud models.
• Knowledge of virtualization (e.g., KVM, Vagrant) and containerization (Docker, Podman, Kubernetes).
• Excellent communication, collaboration, and documentation skills.
Nice to Have
• Hands-on with Lustre or other distributed/parallel filesystems.
• Experience in HPC (High-Performance Computing) environments.
• Familiarity with Kubernetes deployments in hybrid clusters.
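As a small taste of the Linux automation this role calls for, here is a stdlib-only Python sketch of a disk-usage check suitable for cron or a CI health stage. The mount points and threshold are illustrative.

```python
"""Minimal disk-usage check for Linux hosts. Mount points and the alert
threshold are illustrative; a nonzero exit code lets cron/CI flag it."""
import shutil
import sys

MOUNTS = ["/", "/var", "/data"]   # illustrative mount points
THRESHOLD = 0.90                  # alert at 90% full


def main() -> int:
    failed = False
    for mount in MOUNTS:
        try:
            usage = shutil.disk_usage(mount)
        except FileNotFoundError:
            print(f"{mount}: not mounted, skipping")
            continue
        frac = usage.used / usage.total
        print(f"{mount}: {frac:.0%} used ({usage.free // 2**30} GiB free)")
        if frac >= THRESHOLD:
            failed = True
    return 1 if failed else 0


if __name__ == "__main__":
    sys.exit(main())
```

In a real deployment this kind of check would more likely live in Prometheus via node_exporter, with the script reserved for hosts outside the monitoring fabric.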

Posted 8 hours ago

Apply