Datametica Birds is a powerful suite of data validation and migration tools designed to streamline the data management process and help businesses make better decisions.

Eagle: The Planner
• Analyzes legacy data warehouses and identifies patterns.
• Provides an automated assessment of your existing data warehouse (e.g., Teradata, Oracle, Netezza) in just a few weeks.
• Delivers precise estimates for migration duration, resource utilization, and associated costs.

Raven: The Transformer
• 100% automated complex code transformation for rapid migration.
• Translates legacy ETL, data warehouse, and analytics code to modern platforms.
• Converts workloads from Teradata, Oracle, and Netezza to cloud-native platforms like Google BigQuery, Azure Synapse, Snowflake, and AWS Redshift.

Pelican: The Validator
• 100% accuracy with data comparisons at both row and granular cell levels.
• 85% savings in time and cost through automated validation.
• Validates in place with no data movement, preventing breaches and corruption while ensuring compliance with data privacy and residency regulations.

https://www.onixnet.com/datametica-birds-product-suite/
Not specified
INR 50.0 - 65.0 Lacs P.A.
Hybrid
Full Time
Role: Presales Data Architect (Full Time)

Professional Summary: Onix is seeking an experienced Senior Cloud Data Architect to play a key role in designing and implementing data solutions that meet client business needs and support the overall cloud data architecture. The ideal candidate will have a strong understanding of data architecture, data modeling, and data management, with a significant focus on data warehouse design and development, and will have experience with cloud platforms such as AWS and GCP.

Scope/Level of Decision Making: This is an exempt position operating under limited decision-making and supervision. The position performs a variety of assigned activities, referring more complex issues to the manager.

Presales Data Architect: Core Responsibilities
- Solution Design: Lead the pre-sales design of enterprise on-premise, off-premise cloud, and hybrid customer solutions. Create solution responses to RFx requests (RFI/RFP/RFQ) and proactive opportunities.
- Customer Engagement: Engage with customers early in their transformation journey to establish credentials as a partner. Participate in presentations to customers regarding proposed solutions. Provide on-site client consulting, pre-sales support, and technical assistance.
- Technical Leadership: Provide technical leadership to the implementation team and ensure project teams remain on course with respect to technical direction. Drive technical certification and validation for opportunities across customer solutions and architecture.
- Sales Support: Support the sale of data projects, particularly on cloud platforms like Azure. Work directly with the sales team to propose solutions to customers that achieve business outcomes. Contribute to practice growth by creating white papers, best practices, and frameworks.
- Opportunity Management: Manage multiple active opportunities at a time. Vet opportunities and establish the technology architecture. Maintain and update presales trackers with details including funnel, resource requirements, ACV (Annual Contract Value), and TCV (Total Contract Value).
- Collaboration and Communication: Work in alignment with bid solutions architects to understand the solution requirements and the needs of the customer's business units. Resolve technical and design conflicts and issues. Communicate effectively with business partners, including those at the C-suite level.

Key Skills and Experience
- Technical Expertise: Strong understanding of data and analytics, AI/ML (Artificial Intelligence/Machine Learning), and Generative AI solutions. Experience with cloud platforms like Azure, AWS, and GCP. Knowledge of industry trends and customer challenges in areas like AI/ML/GenAI.
- Presales Experience: Solid experience in a business growth or sales role in analytics, AI/ML, and GenAI. Consultative selling experience is essential.
- Communication and Presentation Skills: Excellent oral and written communication skills, including executive presentation and persuasion skills. Ability to prepare polished presentations for clients.
- Analytical Skills: Ability to analyze system requirements and understand customer requirements. Perform SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis.
- General Skills: Adaptability to evolve solutions as engagements progress.
- Ability to interface at all levels of an organization.

In essence, the Presales Data Architect role requires a blend of technical expertise, sales acumen, and strong communication skills to effectively design and propose data solutions that meet customer needs and drive business growth.

Education: Bachelor's degree or equivalent experience required.
Travel Expectation: Up to 15%.

It is the policy of Onix to ensure equal employment opportunity in accordance with Ohio Revised Code 125.111 and all applicable federal regulations and guidelines. Employment discrimination against employees and applicants due to race, color, religion, sex (including sexual harassment), national origin, disability, age (40 years or older), military status, or veteran status is illegal. Onix will only employ those who are legally authorized to work in the United States or Canada. This is not a position for which sponsorship will be provided. Individuals with temporary visas such as E, F-1, H-1, H-2, L, B, J, or TN, or who need sponsorship for work authorization now or in the future, are not eligible for hire.
Not specified
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time
Job Description
Location: Pune
Experience: 6-10 years

We are looking to hire a Data Engineer with strong hands-on experience in SQL, ETL, PySpark, and GCP.

Required Past Experience:
- 3+ years of experience building ETL pipelines, along with GCP cloud and PySpark experience.
- Should have worked on at least one development project from an ETL perspective.
- File processing using Shell/Python scripting.
- Hands-on experience writing business logic in SQL or PL/SQL.
- ETL testing and troubleshooting.
- Hands-on experience with code versioning tools like Git or SVN.
- Good knowledge of code deployment processes and documentation.

Required Skills and Abilities:
- Mandatory skills: hands-on, deep experience working in ETL, a cloud platform (AWS/Azure/GCP, with GCP preferred), and PySpark.
- Secondary skills: strong SQL query and Shell scripting skills.
- Good communication skills to understand business requirements from SMEs.
- Basic knowledge of data modeling.
- Good understanding of end-to-end data pipelines and code optimization.
- Hands-on experience developing ETL pipelines for heterogeneous sources.
- Good to have: experience building a cloud ETL pipeline.
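Since this listing centers on PySpark-based ETL into GCP, here is a minimal illustrative sketch of the kind of pipeline described: read raw files, apply business logic, and load to BigQuery. This is a sketch only, not Onix's actual code; all bucket, project, table, and column names are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch: extract raw CSVs, transform, load to BigQuery.
# All paths, table names, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: file processing from a GCS bucket (hypothetical path)
raw = spark.read.csv(
    "gs://example-bucket/raw/orders/*.csv", header=True, inferSchema=True
)

# Transform: business logic expressed as DataFrame operations
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write to BigQuery via the Spark-BigQuery connector
# (requires the connector JAR on the cluster classpath)
(cleaned.write.format("bigquery")
        .option("table", "example_project.analytics.orders")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("overwrite")
        .save())
```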
Not specified
INR 8.0 - 18.0 Lacs P.A.
Hybrid
Full Time
We are seeking a highly skilled Senior Engineer with expertise in ETL processes, SQL, GCP (Google Cloud Platform), and Python. As a Senior Engineer, you will play a key role in the design, development, and optimization of data pipelines and workflows that drive business insights and analytics. You will collaborate closely with cross-functional teams to ensure data systems are scalable, robust, and efficient.

Key Responsibilities:
- Design & Develop ETL Pipelines: Build, maintain, and optimize scalable ETL workflows to ingest, transform, and load large datasets using best practices.
- Database Management: Write efficient, optimized SQL queries to extract, manipulate, and aggregate data from relational databases. Design and implement database schemas for optimal performance.
- Cloud Infrastructure: Utilize Google Cloud Platform (GCP) services, such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage, to develop and manage cloud-based data solutions.
- Automation & Scripting: Use Python to automate processes, build custom data transformation logic, and integrate with various data systems and services.
- Performance Tuning: Ensure data pipelines and queries are optimized for speed and cost. Troubleshoot issues, implement best practices, and improve system performance.
- Collaboration & Mentorship: Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements. Provide mentorship and guidance to junior engineers.
- Data Quality & Governance: Ensure high-quality data through validation, error handling, logging, and monitoring of ETL processes. Implement data governance best practices to maintain consistency and integrity.
- Documentation & Reporting: Document data pipeline designs, coding standards, and best practices. Create reports for stakeholders to provide insights into data processing activities.

Required Skills and Qualifications:
- ETL Expertise: Strong experience in building, deploying, and optimizing ETL processes with tools like Apache Airflow, Dataflow, or custom Python scripts.
- SQL Proficiency: Advanced SQL skills with experience in writing complex queries, optimizing performance, and working with large datasets.
- Cloud Platform (GCP): Deep understanding and hands-on experience with Google Cloud Platform, specifically BigQuery, Cloud Storage, Pub/Sub, Dataflow, and other data-related services.
- Python: Proficient in Python, especially for data manipulation, ETL automation, and integration with cloud-based solutions.
- Data Engineering Best Practices: Familiarity with modern data engineering frameworks, version control, CI/CD pipelines, and agile methodologies.
- Problem Solving: Strong analytical and troubleshooting skills with a focus on identifying solutions to complex data engineering challenges.
- Communication: Excellent communication skills; able to work effectively in a team and engage with non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud platforms such as AWS or Azure.
- Knowledge of data lake and data warehouse architectures.
- Familiarity with containerization (Docker, Kubernetes) and orchestration tools.
- Understanding of data privacy, security, and compliance best practices.

Education & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
- 5+ years of experience in data engineering or a related technical field.
- Proven experience in designing and implementing data solutions in a cloud environment.
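The qualifications above mention orchestrating ETL with tools like Apache Airflow. Purely as an illustration, here is a minimal Airflow DAG wiring an extract-transform-load sequence; the DAG id, schedule, and task bodies are hypothetical stubs.

```python
# Minimal Apache Airflow DAG sketch for a daily ETL workflow.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull source data (e.g., from an API or GCS); stubbed here.
    print("extracting...")

def transform(**context):
    print("transforming...")

def load(**context):
    print("loading into the warehouse...")

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear dependency: extract -> transform -> load
```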
Not specified
INR 1.0 - 5.0 Lacs P.A.
Work from Office
Full Time
Job Summary:
We are looking for a dynamic and detail-oriented Company Secretary (CS) + Legal Executive with 0-3 years of experience to join our team in Pune, India. The ideal candidate will have a strong understanding of company secretarial practices, corporate governance, and legal compliance. They will support the board of directors, handle legal documentation, ensure regulatory compliance, and provide legal advice on corporate matters.

Key Responsibilities:

Company Secretary (CS) Functions:
- Ensure compliance with corporate laws and regulations, including the Companies Act, SEBI guidelines, and other applicable laws.
- Organize and manage board meetings, AGMs, and committee meetings, including preparing agendas, minutes, and resolutions.
- Maintain statutory records and registers as per regulatory requirements.
- Handle filings with regulatory authorities, including MCA, ROC, SEBI, and stock exchanges.
- Provide corporate governance advice to the board of directors.

Legal Advisory & Compliance:
- Draft, review, and negotiate contracts, agreements, and legal documents.
- Provide legal support in commercial transactions, mergers & acquisitions, and compliance audits.
- Ensure compliance with labor laws, environmental regulations, and industry-specific laws.
- Handle litigation management, including coordinating with external legal counsel.
- Advise on legal risks, business strategies, and corporate policies.

Qualifications & Skills:
- Education: Qualified Company Secretary (CS) from ICSI; a Bachelor's/Master's degree in Law (LLB/LLM) is preferred.
- Experience: 2-4 years of experience in CS and legal roles, preferably in a corporate environment.
- Knowledge: Strong understanding of corporate laws, regulatory compliance, and legal documentation.
- Skills: Excellent communication, negotiation, and interpersonal skills.
- Attention to Detail: Ability to manage multiple priorities and maintain accuracy under tight deadlines.
Not specified
INR 20.0 - 35.0 Lacs P.A.
Hybrid
Full Time
Job Description:
We are looking for a highly skilled Lead Data Engineer with 8-12 years of experience to lead a team in building and managing advanced data solutions. The ideal candidate should have extensive experience with SQL, Teradata, Ab Initio, and Google Cloud Platform (GCP).

Key Responsibilities:
- Lead the design, development, and optimization of large-scale data pipelines, ensuring they meet business and technical requirements.
- Architect and implement data solutions using SQL, Teradata, Ab Initio, and GCP, ensuring scalability, reliability, and performance.
- Mentor and guide a team of data engineers in the development and execution of ETL processes and data integration solutions.
- Collaborate with cross-functional teams (e.g., data scientists, analysts, product managers) to define data strategies and deliver end-to-end data solutions.
- Take ownership of end-to-end data workflows, from data ingestion to transformation, storage, and accessibility.
- Lead performance tuning and optimization efforts for complex SQL queries and Teradata database systems.
- Design and implement data governance, quality, and security best practices to ensure data integrity and compliance.
- Manage the migration of legacy data systems to cloud-based solutions on Google Cloud Platform (GCP).
- Ensure continuous improvement and automation of data pipelines and workflows.
- Troubleshoot and resolve issues related to data quality, pipeline performance, and system integration.
- Stay up-to-date with industry trends and emerging technologies to drive innovation and improve data engineering practices within the team.

Required Skills:
- 8-12 years of experience in data engineering or related roles.
- Strong expertise in SQL, Teradata, and Ab Initio.
- In-depth experience with Google Cloud Platform (GCP), including tools like BigQuery, Cloud Storage, Dataflow, etc.
- Proven track record of leading teams and projects related to data engineering and ETL pipeline development.
- Experience with data warehousing and cloud-native storage solutions.
- Strong analytical and problem-solving skills.
- Experience in setting up and enforcing data governance, security, and compliance standards.

Preferred Skills:
- Familiarity with additional cloud services (AWS, Azure).
- Experience with data modeling and metadata management.
- Knowledge of big data technologies like Hadoop, Spark, etc.
- Strong communication skills and the ability to collaborate effectively with both technical and non-technical teams.
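Since this role involves migrating legacy warehouses to GCP and guarding data quality, a sketch like the following shows how a simple row-count validation might be run in BigQuery using the official google-cloud-bigquery client; the project, dataset, table, and query are hypothetical.

```python
# Sketch: running a row-count validation query in BigQuery with the
# official google-cloud-bigquery client. Names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT order_date, COUNT(*) AS row_count
    FROM `example-project.analytics.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# Results could be diffed against the legacy system's counts
for row in client.query(query).result():
    print(f"{row.order_date}: {row.row_count}")
```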
Not specified
INR 30.0 - 45.0 Lacs P.A.
Hybrid
Full Time
Job Description:
We are seeking a highly experienced Data Architect with 15-20 years of experience to lead the design and implementation of data solutions at scale. The ideal candidate will have deep expertise in cloud technologies, particularly GCP, along with a broad skill set spanning SQL, BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, DLP, Dataproc, Cloud Composer, Python, ETL, and big data technologies like MapR/Hadoop, Hive, Spark, and Scala.

Key Responsibilities:
- Lead the design and implementation of complex data architectures across cloud platforms, ensuring scalability, performance, and cost-efficiency.
- Architect data solutions using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Design and optimize ETL (Ab Initio) processes and data pipelines using Python and related technologies, ensuring seamless data integration across multiple systems.
- Work with big data technologies including Hadoop (MapR), Hive, Spark, and Scala to build and manage large-scale, distributed data systems.
- Oversee the end-to-end data flow from ingestion to processing, transformation, and storage, ensuring high availability and disaster recovery.
- Lead and mentor a team of engineers, guiding them in adopting best practices in data architecture, security, and governance.
- Define and enforce data governance, security, and compliance standards to ensure data privacy and integrity.
- Collaborate with cross-functional teams to understand business requirements and translate them into data architecture and technical solutions.
- Design and implement data lake, data warehouse, and analytics solutions to support business intelligence and advanced analytics.
- Lead the integration of cloud-native tools and services for real-time and batch processing, using Pub/Sub, Dataproc, and Cloud Composer.
- Conduct performance tuning and optimization for SQL, BigQuery, and big data technologies to ensure efficient query execution and resource usage.
- Provide strategic direction on new data technologies, trends, and best practices to ensure the organization remains competitive and innovative.

Required Skills:
- 15-20 years of experience in data architecture, data engineering, or related roles, with a focus on cloud solutions.
- Extensive experience with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Strong experience in ETL with Ab Initio.
- Proficiency in SQL and experience with cloud-native data storage and processing technologies (BigQuery, Hive, Hadoop, Spark).
- Expertise in Python for ETL pipeline development and data manipulation.
- Solid understanding of big data technologies such as MapR, Hadoop, Hive, Spark, and Scala.
- Experience in designing and implementing scalable, high-performance data architectures and data lakes/warehouses.
- Deep understanding of data governance, security, privacy (DLP), and compliance standards.
- Proven experience in leading teams and delivering large-scale data solutions in cloud environments.
- Excellent problem-solving, communication, and leadership skills.
- Ability to work with senior business and technical leaders to align data solutions with organizational goals.

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure).
- Knowledge of machine learning and AI data pipelines.
- Familiarity with containerized environments and orchestration tools (e.g., Kubernetes).
- Experience with advanced analytics or data science initiatives.
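The architecture described leans on Pub/Sub for real-time ingestion. As an illustration only, the sketch below publishes a JSON event to a topic with the official google-cloud-pubsub client; the project, topic, and payload are hypothetical.

```python
# Sketch: publishing an event to a Pub/Sub topic for downstream
# real-time processing. Project and topic names are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")

event = {"order_id": "A-1001", "amount": 250.0}

# Pub/Sub messages are raw bytes; serialize the payload as JSON.
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")
```

A subscriber (e.g., a Dataflow or Dataproc job) would then consume these messages for the real-time half of the batch/streaming architecture the role describes.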
Not specified
INR 20.0 - 35.0 Lacs P.A.
Hybrid
Full Time
Key Responsibilities:
- Design, develop, and maintain data models for the Enterprise Semantic Layer.
- Implement and manage security governance within the Looker environment.
- Collaborate with analysts to understand business requirements and translate them into analytical products.
- Optimize data access to improve performance.
- Conduct experiments and analyze results to refine solutions.
- Maintain and update existing analytical solutions to ensure they remain effective.
- Stay updated with the latest developments in data science and AI/ML.
- Create and maintain documentation for solutions and processes.
- Design, maintain, and optimize LookML models, dashboards, and reports for efficiency and accuracy.
- Working knowledge of LSP.
- Monitor and troubleshoot Looker dashboard performance and resolve technical issues.
- Ensure adherence to data governance and security policies within Looker.
- Implement and monitor access controls to safeguard sensitive data.
- Support data validation and quality assurance.
- Provide training and support to end users for effective utilization of Looker features.
- Act as the primary point of contact for Looker-related inquiries and troubleshooting.
- Work closely with business teams to understand reporting needs and provide data-driven solutions.
- Stay updated on Looker updates, best practices, and new features to maximize platform capabilities.
- Excellent communication and analytical skills; able to understand business requirements across various domains.
- Engage with customers and lead business intelligence and data analytics discussions.

Required Experience:
- Overall 5+ years of experience in the IT industry, with Looker expertise for the last 3+ years.
- Demonstrated capability as a Looker administrator on multiple large projects.
- Good understanding of data modeling and ETL processes on cloud (preferably GCP).
- Prior experience optimizing BI systems for performance, scalability, and reliability.
- Proven experience designing, building, migrating, and modernizing enterprise-level BI applications.
- Expertise in building and maintaining complex dashboards and visualizations on-premise and in the cloud.
- Well-versed in BI and data analytics concepts and best practices.
- Experience with scripting languages (Python, JavaScript) is a plus.
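For a sense of the administration and automation work this role describes, here is a small sketch using the official Looker Python SDK (looker_sdk) to enumerate dashboards and run a saved Look. Credentials are assumed to come from a looker.ini file or environment variables, and the Look id is hypothetical.

```python
# Sketch: Looker administration tasks via the official Python SDK.
# Authentication is configured in looker.ini or environment variables;
# the Look id below is a hypothetical placeholder.
import looker_sdk

sdk = looker_sdk.init40()  # API 4.0 client

# Enumerate dashboards, e.g., when auditing content or access
for dashboard in sdk.all_dashboards(fields="id,title"):
    print(dashboard.id, dashboard.title)

# Run a saved Look and fetch its results as JSON, e.g., for validation
results = sdk.run_look(look_id="42", result_format="json")
print(results)
```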
Not specified
INR 10.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Location: Pune
Experience: 3-9 years

We are looking to hire a Data Engineer with strong hands-on experience in SQL, ETL, PySpark, and GCP.

Required Past Experience:
- 3+ years of experience building ETL pipelines, along with GCP cloud and PySpark experience.
- Should have worked on at least one development project from an ETL perspective.
- File processing using Shell/Python scripting.
- Hands-on experience writing business logic in SQL or PL/SQL.
- ETL testing and troubleshooting.
- Hands-on experience with code versioning tools like Git or SVN.
- Good knowledge of code deployment processes and documentation.

Required Skills and Abilities:
- Mandatory skills: hands-on, deep experience working in ETL, a cloud platform (AWS/Azure/GCP, with GCP preferred), and PySpark.
- Secondary skills: strong SQL query and Shell scripting skills.
- Good communication skills to understand business requirements from SMEs.
- Basic knowledge of data modeling.
- Good understanding of end-to-end data pipelines and code optimization.
- Hands-on experience developing ETL pipelines for heterogeneous sources.
- Good to have: experience building a cloud ETL pipeline.
Not specified
INR 5.0 - 14.0 Lacs P.A.
Work from Office
Full Time
Position: Infosec Analyst - Audit & Compliance
Experience: 3 to 10 years
Location: Pune

Key Responsibility Areas (KRA):
- Regulatory Compliance & Governance: Ensure adherence to ISO 27001, NIST, SOC 2, GDPR, and HIPAA, and enforce security policies.
- Audit & Risk Management: Lead internal/external audits, manage compliance assessments, and drive risk mitigation.
- Incident Response & Compliance Monitoring: Work with Security Operations to monitor incidents, ensure compliance, and support investigations.
- Security Awareness & Training: Develop and implement training programs to strengthen cybersecurity culture.
- Vendor & Third-Party Security: Assess vendor security risks, ensure contract compliance, and enforce security standards.
- Business Continuity & Disaster Recovery (BCDR): Support security-related aspects of BCDR, ensuring compliance with recovery objectives.
- Critical Coordination & Availability: Be available during US business hours for audits, compliance discussions, and security escalations.

Roles & Responsibilities:
- Lead security audits, compliance initiatives, and regulatory assessments.
- Maintain security policies, documentation, and reporting for compliance readiness.
- Serve as the primary contact for auditors, legal teams, and regulatory bodies.
- Oversee remediation efforts for vulnerabilities and drive timely risk mitigation.
- Monitor security controls, drive continuous improvement, and align compliance with business objectives.
- Support security incidents and investigations related to compliance risks.
- Ensure availability for critical discussions, escalations, and audits during US hours.
Not specified
INR 10.0 - 20.0 Lacs P.A.
Hybrid
Full Time
Job Description:
We are looking for a Teradata, Ab Initio, and SQL Developer to join our dynamic team. The ideal candidate will have experience in designing, developing, and optimizing ETL solutions using Teradata, Ab Initio, and SQL. You will be responsible for handling large volumes of data, ensuring smooth data flow, and building efficient ETL processes for analytics and reporting.

Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio and Teradata.
- Create and optimize complex SQL queries for extract, transform, and load (ETL) processes.
- Collaborate with business analysts and other team members to understand data requirements and deliver solutions.
- Work with the data team to design and implement data models, data pipelines, and automated workflows.
- Perform data analysis and create data reports using SQL and Teradata.
- Optimize ETL processes for performance and scalability.
- Troubleshoot and debug issues related to ETL, data load processes, and database performance.
- Monitor, maintain, and ensure the integrity and quality of the data pipeline.
- Document technical specifications, processes, and solutions.
- Participate in code reviews and contribute to continuous improvement practices.

Skills & Qualifications:
- Proven experience working with Teradata, Ab Initio, and SQL.
- Strong experience in designing and developing ETL processes and data pipelines.
- Proficiency in Teradata SQL, complex query optimization, and performance tuning.
- Experience with Ab Initio development, including graph development, parallel processing, and troubleshooting.
- Familiarity with data warehousing concepts and technologies.
- Knowledge of database design, normalization, and indexing techniques.
- Strong problem-solving and debugging skills.
- Experience working with large datasets and distributed systems.
- Excellent written and verbal communication skills.
- Ability to work collaboratively within a team.

Preferred Qualifications:
- Experience with cloud-based data platforms (AWS, Azure, Google Cloud).
- Knowledge of data integration tools and technologies.
- Experience with version control systems like Git.
- Familiarity with scheduling tools like Control-M or Autosys.
- Experience in Agile/Scrum methodologies.

Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

Experience:
- 3+ years of experience as an ETL Developer or similar role.
- Strong hands-on experience with Teradata, Ab Initio, and SQL in an enterprise environment.
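As an illustration of the day-to-day work, the sketch below queries Teradata from Python using the official teradatasql driver, the kind of snippet an ETL developer might use for data analysis or load verification. The host, credentials, and query are hypothetical placeholders.

```python
# Sketch: querying Teradata from Python with the official teradatasql
# driver (DB-API 2.0 style). Connection details and the query are
# hypothetical placeholders.
import teradatasql

with teradatasql.connect(
    host="example-host", user="example-user", password="example-password"
) as con:
    with con.cursor() as cur:
        # Aggregate check, e.g., verifying a completed load by region
        cur.execute(
            "SELECT region, COUNT(*) FROM sales.orders GROUP BY region"
        )
        for region, order_count in cur.fetchall():
            print(region, order_count)
```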
Not specified
INR 9.0 - 17.0 Lacs P.A.
Work from Office
Full Time
Title: Business Development Representative
Experience: 5 to 10 years
Location: Pune
Employment Type: Full-time

About Us:
Onix is a trusted cloud consulting company and leading Google Cloud partner that helps companies get the most out of their technology with cloud-powered solutions, best-in-class services, and the Datametica Birds data migration products that unleash AI potential. We are able to deliver exceptional results for our customers because of our 20+ year partnership with Google Cloud, depth of technology expertise, and IP-driven data and AI solutions.

We offer solutions across a wide range of use cases and industries that are tailored to the unique needs of each customer. From advanced cloud security solutions to innovative AI capabilities and data migration products, we have you covered. Our global team of experts is the most reliable, talented, and knowledgeable in the industry.

Job Role:
- Driving revenue growth: Utilise your expertise in outbound calling, email campaigns, LinkedIn outreach, and regional customer database building to identify and cultivate high-potential leads.
- Championing innovation: Become a data whiz and masterfully explain the impact of our cloud migration/modernisation and automation solutions.
- Collaborating with a winning team: Thrive in a dynamic environment fuelled by growth, surrounded by passionate individuals who value your contributions.

As our BDR, you'll:
- Become a lead generation machine: Unearth promising leads and qualify them efficiently.
- Fuel marketing & sales events: Drive participation and awareness, building connections with prospects.
- Be a data champion: Master the value proposition of our offerings, advocating for Onix with confidence.
- Pitch & present like a pro: Showcase our game-changing products (Eagle, Raven, Pelican) with clarity and persuasiveness.
- Consistently crush goals: Set new standards and exceed expectations, fuelled by a drive for success.
- Stay ahead of the curve: Continuously learn, adapt, and stay informed about industry trends and competitors.

Are you:
- A seasoned pro with 5-15 years of B2B/Cloud/SaaS/IT sales experience?
- An outreach master, comfortable with diverse channels and persuasive communication?
- An adaptable chameleon who thrives in dynamic environments?
- A goal-crushing machine with a proven track record of exceeding targets?
- A communication ninja who can handle objections and educate prospects effectively?
- Comfortable working in US time zones?
- A graduate with a Bachelor's degree, MBA, or equivalent experience?
- Fluent in English?

Why Join Us?
- Competitive salary and comprehensive benefits.
- Opportunities to work with cutting-edge technologies.
- Collaborative company culture.
Not specified
INR 5.0 - 12.0 Lacs P.A.
Work from Office
Full Time
QA Engineer (Product)
Location: India (Pune)
Experience: 3+ years

Key Responsibilities:
- Code Flow Understanding: Gain a deep understanding of how code interacts with various components within the system, ensuring comprehensive testing and validation.
- Test Case Creation & Execution: Analyze project documentation, customer requirements, and product objectives to develop and execute detailed test cases that align with business needs.
- Bug Investigation & Reporting: Investigate customer-reported issues assigned by the technical team, thoroughly test bugs, and create/manage clear and actionable bug reports.
- Collaboration with Deployment Teams: Work closely with deployment teams to address system-level issues, collaborate on product design, and provide feedback on the testability of functional elements and designs.
- Process & Methodology Improvement: Continuously research testing tools, methodologies, and industry trends to enhance existing testing practices and improve processes.
- SQL Expertise: Leverage a strong understanding of SQL to support testing and data validation across multiple systems.
- Unit Testing: Review and enhance developer unit tests while writing your own unit tests within the same codebase, ensuring comprehensive coverage and quality.
- Exploring New Technologies: Investigate new tools, technologies, and testing scenarios, particularly for data warehousing tools and scenarios.
- Attention to Detail & Self-Motivation: Demonstrate keen attention to detail and a self-motivated approach to problem-solving and task completion.
- Problem-Solving & Initiative: Exhibit strong problem-solving abilities and a proactive "get-things-done" attitude, driving results in complex testing scenarios.

Key Skills:
- Core Testing Skills: Strong knowledge of testing methodologies and techniques.
- SQL: Proficiency in SQL for effective data validation and troubleshooting.
- Shell Scripting: Experience with Shell scripting for automation and testing tasks.
- API Testing: Hands-on experience with API testing and validation.

Nice-to-Have Skills:
- Graph Databases (Neo4j): Familiarity with graph databases and how they work within the testing framework.
- ETL Tools: Knowledge of ETL (Extract, Transform, Load) tools and their role in data processing.
- Cloud Technologies (GCP): Experience with Google Cloud Platform (GCP) or other cloud environments.
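To illustrate the API-testing-plus-SQL-validation combination this role calls for, here is a small hypothetical pytest sketch that cross-checks an API response against a database record. The endpoint, database, table, and fields are placeholders, and sqlite3 stands in for the real warehouse connection.

```python
# Sketch: pairing an HTTP check with a SQL-based data validation using
# pytest and requests. Endpoint, database, and fields are hypothetical.
import sqlite3  # stand-in for the real warehouse driver

import requests

BASE_URL = "https://api.example.com"

def test_order_endpoint_matches_database():
    # The API under test returns the order as JSON
    resp = requests.get(f"{BASE_URL}/orders/1001", timeout=10)
    assert resp.status_code == 200
    order = resp.json()

    # Cross-check the same record in the database
    con = sqlite3.connect("example.db")
    row = con.execute(
        "SELECT status, amount FROM orders WHERE id = ?", (1001,)
    ).fetchone()
    con.close()

    # API and warehouse should agree on the record's contents
    assert order["status"] == row[0]
    assert order["amount"] == row[1]
```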
Not specified
INR 5.0 - 15.0 Lacs P.A.
Work from Office
Full Time
DevOps Engineer

Job Summary:
As a DevOps Engineer, you will play a key role in overseeing and implementing the DevOps practices within our organization. You will be responsible for collaborating with development, operations, and quality assurance teams to automate and streamline our operations and processes. The ideal candidate should have a strong background in both software development and IT operations, with a focus on building and maintaining CI/CD pipelines, infrastructure as code, and ensuring the overall stability and scalability of our systems.

Responsibilities:
- DevOps Strategy: Develop and implement a comprehensive DevOps strategy to enhance the efficiency and effectiveness of our software development and release processes. Collaborate with stakeholders to define DevOps goals and objectives aligned with business objectives.
- Continuous Integration/Continuous Deployment (CI/CD): Design, implement, and maintain CI/CD pipelines for automating the build, test, and deployment processes. Ensure the continuous integration and delivery of applications with a focus on reliability and speed.
- Infrastructure as Code (IaC): Implement and manage infrastructure as code using tools like Helm, Terraform, Ansible, or CloudFormation. Work closely with infrastructure and development teams to automate the provisioning and configuration of infrastructure.
- Monitoring and Logging: Implement monitoring and logging solutions to proactively identify and resolve issues. Collaborate with teams to analyze system performance and implement improvements.
- Security: Implement and enforce security best practices for infrastructure and applications. Collaborate with the security team to conduct regular security assessments and audits.
- Collaboration and Communication: Foster collaboration between development, operations, and QA teams. Communicate effectively with team members, stakeholders, and leadership about DevOps initiatives and improvements.
- Incident Response and Resolution: Participate in incident response activities and work towards minimizing system downtime. Develop and maintain documentation for incident response procedures.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in a DevOps or similar role.
- Strong knowledge of CI/CD tools such as Jenkins, GitLab CI, and Maven.
- Proficiency in scripting languages such as Shell, Python, or Ruby.
- Proficiency in infrastructure management.
- Experience with containerization and orchestration tools such as Docker, Docker Compose, and Kubernetes.
- Experience with building monitoring dashboards (e.g., Grafana).
- Solid understanding of cloud platforms like Google Cloud (preferred), AWS, and Azure.
- Familiarity with configuration management tools like Ansible, Puppet, or Chef.
- Knowledge of infrastructure-as-code principles and tools.
- Excellent problem-solving and communication skills.

Preferred:
- Relevant certifications in DevOps and GCP.
- Knowledge of Agile/Scrum methodologies.
- Familiarity with version control systems such as Git.
- Knowledge of database administration.
- Familiarity with Hadoop administration and tuning.
- Understanding of build frameworks (Gradle/Maven).
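As one small example of the monitoring automation this role involves, the sketch below uses the official Kubernetes Python client to flag pods that are not in the Running phase, the kind of helper that might feed an alert or a dashboard. It assumes cluster access via a local kubeconfig; namespaces and output are illustrative.

```python
# Sketch: flag non-Running pods across all namespaces using the
# official Kubernetes Python client. Assumes a local kubeconfig.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    if pod.status.phase != "Running":
        # e.g., forward these to an alerting channel or dashboard
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
```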
Not specified
INR 5.0 - 15.0 Lacs P.A.
Work from Office
Full Time
We are looking for a skilled and motivated Core Java Developer to join our development team. As a Core Java Developer, you will be responsible for designing, implementing, and maintaining Java-based applications. You will work closely with other developers, system architects, and product managers to deliver high-quality solutions that meet business needs.

Responsibilities:
- Design, implement, and maintain Java-based applications and systems.
- Write clean, scalable, and efficient code using Java and related technologies.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Troubleshoot, debug, and optimize existing code.
- Conduct code reviews and provide constructive feedback to team members.
- Participate in the full software development lifecycle, from concept to delivery.
- Work with databases, data structures, and algorithms to optimize performance.
- Ensure the quality of the application by writing unit tests and performing integration testing.
- Stay updated on the latest Java technologies and best practices.

Required Skills & Qualifications:
- Proven experience as a Java Developer or similar role.
- Strong knowledge of Core Java concepts (object-oriented programming, exception handling, multithreading, collections, streams, etc.).
- Proficiency with Java 8 or later versions.
- Experience with the JVM, garbage collection, concurrency, and memory management.
- Familiarity with the Spring Framework (Spring Boot, Spring MVC, etc.) and Hibernate.
- Knowledge of web services (RESTful APIs, SOAP) and client-server architectures.
- Familiarity with databases (e.g., MySQL, PostgreSQL, Oracle) and writing complex queries.
- Understanding of design patterns and software architecture.
- Ability to write unit tests using JUnit or TestNG.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Experience with version control systems (e.g., Git).
- Familiarity with Agile methodologies (Scrum, Kanban).
- Knowledge of build tools such as Maven or Gradle.
- Experience working with cloud platforms like AWS or Azure.
- Familiarity with front-end technologies (e.g., JavaScript, HTML, CSS) is a plus.
- Experience with containerization (e.g., Docker) is an advantage.

Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.

Experience:
- Years of experience in Core Java development.

Benefits:
- Competitive salary.
- Health insurance, retirement plans, and other benefits.
- Opportunities for career growth and development.
- Collaborative and dynamic work environment.
Not specified
INR 18.0 - 30.0 Lacs P.A.
Hybrid
Full Time
We are looking for a Senior/Lead Software Engineer with expertise in Rust, Java, Microservices, GraphQL, and Flink to develop and optimize high-performance backend systems and real-time data pipelines.

Key Responsibilities:
- Develop backend services using Rust and Java (Spring Boot) with optimized API performance.
- Design and maintain scalable and secure microservices while implementing RESTful and GraphQL APIs.
- Create efficient resolvers and optimize GraphQL queries to reduce response time.
- Build real-time data processing pipelines using Apache Flink and integrate with Kafka, Kinesis, and RabbitMQ.
- Implement stateful Flink applications with effective use of windowing, key-value state, and checkpointing mechanisms.
- Optimize Flink jobs to ensure high performance, low latency, and fault tolerance.
- Design APIs for exposing real-time data streams and implement event-driven architectures.
- Deploy, manage, and monitor applications on cloud platforms (AWS, GCP, or Azure) using Docker and Kubernetes.
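Although this role targets Rust and Java, a compact way to illustrate the stateful keyed processing it describes is via PyFlink, Flink's Python API. The sketch below keeps a running sum per key in Flink-managed state and enables checkpointing for fault tolerance; the data and job name are hypothetical, and a production job would read from Kafka or Kinesis rather than an in-memory collection.

```python
# Sketch of a stateful keyed aggregation in Flink, shown via PyFlink for
# brevity (the role itself targets Java/Rust). The running sum per key
# is kept in Flink-managed state by the reduce operator.
from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
env.enable_checkpointing(10_000)  # checkpoint every 10s for fault tolerance

events = env.from_collection(
    [("user-a", 1), ("user-b", 5), ("user-a", 3)],
    type_info=Types.TUPLE([Types.STRING(), Types.INT()]),
)

(events
    .key_by(lambda e: e[0])                    # partition the stream by key
    .reduce(lambda a, b: (a[0], a[1] + b[1]))  # stateful running sum
    .print())

env.execute("keyed-running-sum")
```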
1. Are certifications needed?
A. Certifications in cloud or data-related fields are often preferred.
2. Do they offer internships?
A. Yes, internships are available for students and recent graduates.
3. Do they support remote work?
A. Yes, hybrid and remote roles are offered depending on the project.
4. How can I get a job there?
A. Apply via careers portal, attend campus drives, or use referrals.
5. How many rounds are there in the interview?
A. Usually 2 to 3 rounds including technical and HR.
6. What is the interview process?
A. It typically includes aptitude, technical, and HR rounds.
7. What is the work culture like?
A. The company promotes flexibility, innovation, and collaboration.
8. What is their average salary for freshers?
A. Freshers earn between 3.5 to 6 LPA depending on role.
9. What kind of projects do they handle?
A. They handle digital transformation, consulting, and IT services.
10. What technologies do they work with?
A. They work with cloud, AI, cybersecurity, and digital solutions.