InOrg Global is a technology company specializing in innovative software solutions for business operations.
Pune
INR Not disclosed
Work from Office
Internship
InOrg Global is looking for an Intern - Data Engineer & AI/ML to join our dynamic team and embark on a rewarding career journey.
- Learning: Interns are there to learn and gain hands-on experience in a particular field or industry. They may assist with various tasks and projects, shadow experienced professionals, and participate in training sessions.
- Project Work: Interns often work on specific projects or tasks that align with their educational background and career interests. These projects can vary widely depending on the company and the internship's focus.
- Supervision: Interns typically report to a supervisor or mentor who provides guidance, sets expectations, and evaluates their performance.
- Networking: Internships provide opportunities for networking and building relationships within the industry, which can be valuable for future career opportunities.
- Skill Development: Interns can develop and enhance their skills, including technical, communication, problem-solving, and teamwork skills.
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 4.0 - 7.0 Lacs P.A.
Work from Office
Full Time
InOrg Global is looking for a Data Engineer - Databricks to join our dynamic team and embark on a rewarding career journey.
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists (see the sketch after this listing).
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
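As a rough illustration of the "preparing raw data" responsibility above, here is a minimal PySpark sketch of the kind of cleanup step such a role might involve. The file paths, column names, and cleanup rules are hypothetical, not taken from the posting.

    # Minimal PySpark cleanup sketch (hypothetical paths and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("raw-data-prep").getOrCreate()

    # Ingest raw CSV events; schema inference keeps the example short.
    raw = (spark.read
           .option("header", True)
           .option("inferSchema", True)
           .csv("/mnt/raw/events/"))  # hypothetical mount point

    # Basic hygiene: drop exact duplicates and null keys, normalize timestamps.
    clean = (raw.dropDuplicates()
             .dropna(subset=["event_id"])              # hypothetical key column
             .withColumn("event_ts", F.to_timestamp("event_ts")))

    # Hand off to data scientists as columnar Parquet.
    clean.write.mode("overwrite").parquet("/mnt/curated/events/")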
Pune
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Note: Working hours 6:30 pm to 3:30 am IST. Freelancing/Contractual Role.
Key Responsibilities:
- Kubernetes Administration: Manage cluster lifecycle (health, upgrades, autoscaling, certs, resource tuning).
- Application Runtime Support: Monitor app deployments, ensure uptime, manage namespaces/configs (see the monitoring sketch below).
- DigitalOcean Cloud Operations: Administer compute, networking, DNS, firewalls, backups; respond to system alerts.
- Redis Cache Management: Tune performance, manage availability/scaling, and respond to cache-related issues.
- MySQL Cluster Management: Administer clusters, replication, schema management, access controls, patching.
- Infrastructure as Code (IaC): Maintain infrastructure in Terraform, apply GitOps or CI/CD workflows, manage PR reviews.
- Security & Compliance Oversight: Manage IAM/RBAC, enforce patching/updates, detect drift/misconfigurations.
- Incident Response: Rapid response to service disruptions, RCA documentation, and resolution ownership.
- Reporting & Advisory: Produce monthly reports with key metrics, events, and optimization recommendations.
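As a loose illustration of the application-runtime-support duty, this sketch uses the official kubernetes Python client to list deployments in a namespace and flag any whose ready replica count lags the desired count. The namespace name is a placeholder; a real setup would also wire results into alerting and cover the DigitalOcean-specific tooling the role mentions.

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (inside a cluster,
    # config.load_incluster_config() would be used instead).
    config.load_kube_config()

    apps = client.AppsV1Api()
    namespace = "production"  # placeholder namespace

    # Compare desired vs. ready replicas for each deployment.
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        status = "OK" if ready >= desired else "DEGRADED"
        print(f"{dep.metadata.name}: {ready}/{desired} ready [{status}]")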
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 5.0 - 10.0 Lacs P.A.
Work from Office
Full Time
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Databricks and Apache Spark
- Integrate data from various sources into data lakes or data warehouses
- Implement and manage Delta Lake architecture for reliable, versioned data storage (see the sketch after this listing)
- Ensure data quality, performance, and reliability through testing and monitoring
- Collaborate with data analysts, scientists, and stakeholders to meet data needs
- Automate workflows and manage job scheduling within Databricks
- Maintain clear and thorough documentation of data workflows and architecture
- Work on Databricks-based AI/ML solutions, including machine learning pipelines, in collaboration with data science teams
Requirements:
- Experience: 3+ years in data engineering with strong exposure to Databricks, AI/ML, and big data tools
- Technical Skills: Proficient in Python or Scala for ETL development; strong understanding of Apache Spark, Delta Lake, and Databricks SQL; familiar with REST APIs, including the Databricks REST API
- Cloud Platforms: Experience with AWS, Azure, or GCP
- Data Modeling: Familiarity with data lakehouse concepts and dimensional modeling
- Version Control & CI/CD: Comfortable using Git and pipeline automation tools
- Soft Skills: Strong problem-solving abilities, attention to detail, and teamwork
Nice to Have:
- Certifications: Databricks Certified Data Engineer Associate/Professional
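To make the Delta Lake responsibility concrete, here is a minimal PySpark sketch of writing a versioned Delta table and reading an earlier version back via time travel. The table path and columns are hypothetical; on Databricks the spark session is provided, while running this locally assumes the delta-spark package is installed and configured.

    from pyspark.sql import SparkSession

    # On Databricks, `spark` already exists; this line is for local runs.
    spark = SparkSession.builder.appName("delta-demo").getOrCreate()

    # Hypothetical table: write version 0.
    orders = spark.createDataFrame(
        [(1, "shipped"), (2, "pending")], ["order_id", "status"])
    orders.write.format("delta").mode("overwrite").save("/mnt/lake/orders")

    # Overwrite with updated statuses; Delta records this as version 1.
    updated = spark.createDataFrame(
        [(1, "delivered"), (2, "shipped")], ["order_id", "status"])
    updated.write.format("delta").mode("overwrite").save("/mnt/lake/orders")

    # Time travel: read the table as it was at version 0.
    v0 = (spark.read.format("delta")
          .option("versionAsOf", 0)
          .load("/mnt/lake/orders"))
    v0.show()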
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 2.0 - 3.0 Lacs P.A.
Work from Office
Full Time
Experience Level: Fresher
Education: B.Tech in Computer Science / Information Technology
Start Date: Immediate Joiners
About the Company:
At Avyka, we're passionate about empowering organizations to navigate the complexities of digital transformation. Our journey began with a vision to revolutionize how businesses leverage cloud technologies to achieve their goals. Today, we're proud to be a leading provider of DevSecOps solutions, specializing in Harness.io, a powerful continuous delivery platform. Our strategic partnership with Harness.io is at the core of our success. We have a deep understanding of Harness.io's capabilities and best practices, enabling us to deliver tailored solutions that meet the unique needs of our clients. Avyka is a dedicated system integrator specializing in harnessing the full potential of Harness to revolutionize cloud infrastructure and enhance DevSecOps practices. Our mission is to drive business agility, foster innovation, and ensure seamless integration across every stage of your software delivery lifecycle.
About the Role:
We are looking for highly motivated and enthusiastic fresh graduates to join our growing DevOps team. This is a remote opportunity designed to kick-start your career in a dynamic, technology-driven environment.
Key Responsibilities:
- Assist in building, deploying, and maintaining CI/CD pipelines.
- Support configuration management and automation tasks.
- Monitor system performance and availability (see the sketch below for a minimal example).
- Collaborate with development and QA teams to streamline operations.
- Work on cloud platforms such as AWS, Azure, or GCP under guidance.
- Learn and implement infrastructure as code (IaC) practices using tools like Terraform or Ansible.
Required Qualifications:
- B.Tech in Computer Science or Information Technology.
- Basic understanding of Python, SQL, Linux, Git, and cloud computing concepts.
- Familiarity with any scripting language (Bash, Python, etc.).
- Eagerness to learn DevOps tools and practices.
- Good communication skills and the ability to work in a remote team environment.
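As a small illustration of the monitoring responsibility, the script below polls an HTTP health endpoint and reports status and latency using only the Python standard library. The URL is a placeholder; a production setup would feed results into a proper monitoring stack rather than printing them.

    import time
    import urllib.error
    import urllib.request

    URL = "https://example.com/health"  # placeholder endpoint

    def check(url: str, timeout: float = 5.0) -> None:
        """Fetch the endpoint once and print status code and latency."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                elapsed = time.monotonic() - start
                print(f"UP   {resp.status} in {elapsed:.3f}s")
        except (urllib.error.URLError, TimeoutError) as exc:
            elapsed = time.monotonic() - start
            print(f"DOWN after {elapsed:.3f}s: {exc}")

    if __name__ == "__main__":
        check(URL)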