Inference Labs

4 Job openings at Inference Labs
Data Engineer | Location not specified | 4-7 years | Not disclosed | On-site | Part Time

Job Category: Data Engineer
Job Type: Hybrid
Job Location: Bangalore
Job Experience: 4-7 years

We are seeking a Data Engineer to join our growing team. The Data Engineer will be responsible for designing, developing, and maintaining our ETL pipelines and managing our database systems. The ideal candidate should have a strong background in SQL, database design, and ETL processes.

Key Responsibilities:
– Analyse the different source systems, profile data, and understand, document, and fix data quality issues.
– Gather requirements and business-process knowledge to transform the data in a way that is geared towards the needs of end users.
– Write complex SQL to extract and format source data for ETL/data pipelines.
– Design, implement, and maintain systems that collect and analyze business intelligence data.
– Design and architect an analytical data store or cluster for the enterprise, and implement data pipelines that extract, transform, and load data into an information product that helps the organization reach strategic goals.
– Create design documents, source-to-target mapping documents, and any supporting documents needed for deployment/migration.
– Design, develop, and test ETL/data pipelines.
– Design and build metadata-based frameworks for data pipelines.
– Write unit test cases, execute unit testing, and document unit test results.
– Manage and maintain the database, warehouse, and cluster along with other dependent infrastructure.
– Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources.

Required Skills:
– Expertise in managing and optimizing Spark clusters, along with other implementations of Spark.
– Strong programming skills in Python and PySpark.
– Strong proficiency in SQL and experience with relational databases (PostgreSQL, MySQL, Oracle, etc.) and NoSQL databases (MongoDB, Cassandra, DynamoDB).
– Knowledge of data-modelling techniques such as star/snowflake schemas and data vault.
– Knowledge of semantic modelling.
– Strong problem-solving skills, with the business acumen to move between macro business strategy and micro, tangible data and AI products.

Technologies preferred: Azure, Databricks

Eligibility Criteria for the Job
Education: B.E/B.Tech in any specialization, BCA, M.Tech in any specialization, or MCA.
Primary Skills:
1. SQL
2. Databricks
3. Experience with any one cloud platform (AWS, Azure, GCP)
4. Python, PySpark
Management Skills:
1. Ability to handle multiple tasks and projects simultaneously in an organized and timely manner.
Soft Skills:
1. Good communication skills, verbal and written.
2. Attention to detail.
3. Positive attitude and confidence.
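As a rough illustration of the kind of work this role describes (a minimal sketch, not part of the posting): profile source rows for data-quality issues, clean and validate them, and load them into a star-schema-style fact/dimension pair. The table names, column names, and sample rows below are all hypothetical, and the standard-library sqlite3 module stands in for a real warehouse.

```python
import sqlite3

# Hypothetical source rows extracted from an upstream system; in practice
# these would come from a database, API, or file feed.
source_rows = [
    {"order_id": 1, "customer": "Acme", "amount": "120.50"},
    {"order_id": 2, "customer": "  Beta ", "amount": "80"},
    {"order_id": 3, "customer": None, "amount": "bad"},  # data-quality issue
]

def profile_and_clean(rows):
    """Split rows into clean records and rejected records with a reason."""
    clean, rejected = [], []
    for row in rows:
        try:
            if not row["customer"]:
                raise ValueError("missing customer")
            clean.append({
                "order_id": row["order_id"],
                "customer": row["customer"].strip(),   # normalize whitespace
                "amount": float(row["amount"]),        # validate numeric field
            })
        except (ValueError, TypeError) as exc:
            rejected.append((row, str(exc)))
    return clean, rejected

# A tiny star-schema target: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, customer_id INTEGER, amount REAL)")

clean, rejected = profile_and_clean(source_rows)
for rec in clean:
    conn.execute("INSERT OR IGNORE INTO dim_customer (name) VALUES (?)", (rec["customer"],))
    (cust_id,) = conn.execute(
        "SELECT id FROM dim_customer WHERE name = ?", (rec["customer"],)
    ).fetchone()
    conn.execute(
        "INSERT INTO fact_orders VALUES (?, ?, ?)",
        (rec["order_id"], cust_id, rec["amount"]),
    )

total = conn.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(len(clean), len(rejected), total)  # 2 1 200.5
```

In a production pipeline the rejected records would be written to a quarantine table and surfaced in a data-quality report rather than silently dropped.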

Data Engineer | Bengaluru, Karnataka, India | 4-7 years | Not disclosed | On-site | Full Time

Job Category: Data Engineer
Job Type: Hybrid
Job Location: Bangalore
Job Experience: 4-7 years

(Same description as the Data Engineer listing above.)

Talent Acquisition Specialist/Recruiter | Bengaluru | 2-7 years | INR 1.0-5.0 Lacs P.A. | Work from Office | Full Time

We are seeking a dynamic and results-driven Recruiter with 2 years of experience in IT and Data hiring. The ideal candidate will have a proven track record in managing end-to-end recruitment for both contract and permanent roles, with a strong understanding of technical skill sets and market trends.

Key Responsibilities:
– Source, screen, and shortlist candidates for IT and Data roles across various technologies.
– Manage full-cycle recruitment for both contract and permanent positions.
– Collaborate with hiring managers to understand job requirements and team dynamics.
– Utilize job boards, social media, and networking to attract top talent; well versed in using job portals for sourcing.
– Maintain and update candidate databases and recruitment reports.
– Ensure a positive candidate experience throughout the hiring process.

Required Skills & Experience:
– 2 years of hands-on experience in IT and Data recruitment.
– Experience in both contract and permanent staffing models.
– Familiarity with ATS tools and sourcing platforms (e.g., LinkedIn, Naukri, Indeed).
– Strong understanding of technical roles such as developers, data analysts, and engineers.

Soft Skills:
– Excellent communication and interpersonal skills.
– Strong organizational and time-management abilities.
– Proactive, adaptable, and able to work in a fast-paced environment.
– High emotional intelligence and stakeholder-management skills.
– Problem-solving mindset with attention to detail.

GCP Security Engineer | Haryana | 2-6 years | Not disclosed | On-site | Full Time

We are looking for an experienced GCP Security Engineer to design, implement, and manage security solutions within Google Cloud Platform (GCP). The ideal candidate should possess a thorough understanding of cloud security best practices and compliance frameworks, along with practical experience with GCP security services.

Key Responsibilities:
– Design secure cloud architectures on GCP in adherence to security best practices and standards such as CIS Benchmarks and NIST guidelines, and integrate security by design across cloud services and applications.
– Deploy and configure GCP security services such as Cloud IAM, VPC Service Controls, Cloud Armor, Security Command Center, Cloud KMS, and Cloud HSM.
– Conduct regular vulnerability assessments and penetration testing on GCP resources; remediate identified vulnerabilities and provide security recommendations.
– Ensure GCP environment compliance with security standards such as ISO 27001, SOC 2, HIPAA, and GDPR; support internal and external security audits and implement remediation plans.
– Automate security tasks using scripting languages such as Python and Bash, and infrastructure-as-code tools such as Terraform and Cloud Deployment Manager.
– Configure and manage security logging and monitoring tools to detect, analyze, and respond to security events.
– Collaborate with engineering, operations, and development teams to integrate security into the software development lifecycle (DevSecOps), and communicate security best practices to technical and non-technical stakeholders.
– Stay up to date on the latest GCP security features, vulnerabilities, and emerging threats; evaluate and recommend new security tools and technologies.
– Participate in security incident response activities, including investigation, containment, eradication, and recovery.

Required Qualifications:
– Bachelor's degree in Computer Science, Information Security, or a related field.
– 5+ years of experience in IT security with a focus on cloud security, including a minimum of 2 years of hands-on experience with GCP security services and best practices.
– Strong knowledge of cloud security concepts, principles, and technologies.
– Experience with security hardening, vulnerability management, and incident response.
– Familiarity with security compliance frameworks and regulations such as ISO 27001, SOC 2, HIPAA, and GDPR.
– Excellent communication and collaboration skills.
– GCP Security Engineer or Professional certification preferred.

Bonus Qualifications:
– Experience in insights management for reporting and security analytics.
– Strong knowledge of SQL to support big data teams in managing and securing large-scale data environments.
– Familiarity with data visualization tools for security insights, such as Looker and Tableau.
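To give a flavour of the security automation this role describes (a minimal sketch, not part of the posting): a small Python check that scans a GCP IAM resource policy — which uses the standard {"bindings": [{"role": ..., "members": [...]}]} JSON shape — for bindings granting access to the public principals allUsers or allAuthenticatedUsers. The example policy below is invented for illustration.

```python
# Public principals that make a GCP resource world-accessible.
PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def find_public_bindings(policy):
    """Return (role, member) pairs in an IAM policy that expose a resource publicly."""
    findings = []
    for binding in policy.get("bindings", []):
        for member in binding.get("members", []):
            if member in PUBLIC_MEMBERS:
                findings.append((binding["role"], member))
    return findings

# Invented example policy: one public binding, one scoped to a single user.
example_policy = {
    "bindings": [
        {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
        {"role": "roles/editor", "members": ["user:dev@example.com"]},
    ]
}

findings = find_public_bindings(example_policy)
print(findings)  # [('roles/storage.objectViewer', 'allUsers')]
```

In practice a check like this would run against policies fetched via the Resource Manager or Asset Inventory APIs and feed findings into a report or alerting pipeline; Security Command Center provides comparable managed detections out of the box.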