IQHQ is the intelligence (IQ) headquarters (HQ) of today's organizations, maximizing the potential of Big Data by powering data-driven businesses with foremost intelligence resources. Although the volume of data generated every day keeps growing, only a small fraction of it is ever used. As data becomes one of the most valuable resources, organizations demand a solution that helps them efficiently aggregate, process, and analyze Big Data to make the right impactful decisions, discover hidden insights, leverage every emerging opportunity, and gain an inimitable competitive advantage. That solution is IQHQ. IQHQ enables users to:
- Explore aggregated external data in structured form
- Import internal data and integrate it with external data
- Analyze and visualize data to discover hidden insights and opportunities
- Monetize data or acquire data from other users
- Request "Custom Intelligence", a flagship custom data sourcing service for organizations with special data needs
Hyderabad
INR 30.0 - 35.0 Lacs P.A.
Work from Office
Full Time
":" Job Description: We are looking for a highly skilled Senior Data Scientist with 39 years of experience specializing in Python, Large Language Models (LLMs), NLP, Machine Learning, and Generative AI . The ideal candidate will have a deep understanding of building intelligent systems using modern AI frameworks and deploying them into scalable, production-grade environments. You will work closely with cross-functional teams to build innovative AI solutions that deliver real business value. Responsibilities: Design, develop, and deploy ML/NLP solutions using Python and state-of-the-art AI frameworks. Apply LLMs and Generative AI techniques to solve real-world problems. Build, train, fine-tune, and evaluate models for NLP and GenAI tasks. Collaborate with data engineers, MLOps, and product teams to operationalize models. Contribute to the development of scalable AI services and applications. Analyze large datasets to extract insights and support model development. Maintain clean, modular, and version-controlled code using Git. Requirements Must-Have Skills: 310 years of hands-on experience with Python for data science and ML applications. Strong expertise in Machine Learning algorithms and model development. Proficient in Natural Language Processing (NLP) and text analytics. Experience with Large Language Models (LLMs) and Generative AI frameworks (e.g., LangChain, Hugging Face Transformers). Familiarity with model deployment and real-world application integration. Experience with version control systems like Git . Good to Have: Experience with PySpark for distributed data processing. Exposure to MLOps practices and model lifecycle management. Familiarity with cloud platforms such as AWS, GCP, or Azure. Knowledge of vector databases (e.g., FAISS, Pinecone) and embeddings. Educational Qualification: Bacheloror Masterdegree in Computer Science, Data Science, Statistics, or a related field. 
Benefits:
- Work with cutting-edge technologies in a collaborative and forward-thinking environment.
- Opportunities for continuous learning, skill development, and career growth.
- Exposure to high-impact projects in AI and data science.
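The good-to-have list above mentions embeddings and vector databases (FAISS, Pinecone). At their core, these store embedding vectors and answer nearest-neighbour queries by similarity. A dependency-free Python sketch of that idea (the three-dimensional "embeddings" are invented toy values, not output of a real model):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query, store):
    # Return the key whose stored vector is most similar to the query.
    return max(store, key=lambda k: cosine_similarity(query, store[k]))

# Toy "embeddings"; in practice these come from an embedding model.
store = {
    "invoice": [0.9, 0.1, 0.0],
    "contract": [0.8, 0.3, 0.1],
    "holiday": [0.0, 0.2, 0.9],
}

print(nearest([0.85, 0.15, 0.05], store))  # → invoice
```

A real vector database adds approximate-nearest-neighbour indexing so this search stays fast at millions of vectors, but the query semantics are the same.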
Hyderabad
INR 3.0 - 6.0 Lacs P.A.
Work from Office
Full Time
We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and financial domains, with a strong focus on creating robust, scalable, and efficient solutions.

Key Responsibilities:
- Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.
- Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions.
- ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.
- Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.
- Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.
- Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.
- Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.
- Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

Requirements
Required Qualifications:
- 5+ years of experience with Python programming.
- 5+ years of experience in cloud infrastructure, particularly AWS.
- 3+ years of experience with PySpark, including usage with EMR or Glue Notebooks.
- 3+ years of experience with Apache Airflow for workflow orchestration.
- Solid experience with data analysis in fast-paced environments.
- Strong understanding of capital markets, financial systems, or prior experience in the financial domain is a must.
- Proficiency with cloud-native technologies and frameworks.
- Familiarity with CI/CD practices and tools like Jenkins, GitLab CI/CD, or AWS CodePipeline.
- Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.
- Excellent problem-solving skills and ability to handle complex technical challenges.
- Strong communication and interpersonal skills for collaboration across teams and presenting solutions to diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment.

Benefits: Standard Company Benefits
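The role above pairs PySpark ETL with Apache Airflow orchestration. Airflow models a pipeline as a directed acyclic graph of tasks and runs each task only once its upstream dependencies finish; the resulting run order is a topological sort. A minimal stdlib-only illustration of that scheduling idea (the task names are hypothetical, not from the posting):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical ETL task graph: each task maps to its upstream dependencies.
dag = {
    "extract_trades": set(),
    "extract_reference": set(),
    "transform": {"extract_trades", "extract_reference"},
    "load_warehouse": {"transform"},
    "compliance_report": {"load_warehouse"},
}

# static_order() yields tasks with all predecessors before their dependents,
# i.e. a valid single-worker execution order for the pipeline.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In Airflow the same graph would be declared with operators and `>>` dependencies; the scheduler then dispatches ready tasks, possibly in parallel, rather than computing one linear order.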
Hyderabad
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Job Overview: We seek a highly skilled Java Full Stack Developer who is comfortable with both frontend and backend development. The ideal candidate will be responsible for developing and designing frontend web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. The Java Full Stack Developer will be required to see a project through from conception to final product, which requires good organizational skills and attention to detail.

Key Responsibilities:
- Frontend Development: Design and develop user-facing web applications using modern frontend languages like HTML, CSS, and JavaScript and frameworks like React.js, Angular, or Vue.js.
- Backend Development: Build and maintain server-side application logic using languages such as Node.js, Python, Ruby, Java, or PHP, and manage database interactions with MySQL, PostgreSQL, MongoDB, or other database systems.
- API Development and Integration: Develop and integrate RESTful APIs to connect frontend and backend components, ensuring smooth data flow and communication between different parts of the application.
- Database Management: Design, implement, and manage databases, ensuring data integrity, security, and optimal performance.
- Version Control and Collaboration: Use Git and other version control systems to track code changes and collaborate with other developers on the team.
- Deployment and DevOps: Automate deployment processes, manage cloud infrastructure, and ensure the scalability and reliability of applications through CI/CD pipelines.
- Security Implementation: Implement security best practices to protect the application from vulnerabilities, including authentication, authorization, and data encryption.
- Cross-Platform Optimization: Ensure the application is responsive and optimized for different devices, platforms, and browsers.
- Troubleshooting and Debugging: Identify, diagnose, and fix bugs and performance issues in the application, ensuring a smooth user experience.
- Collaboration and Communication: Work closely with product managers, designers, and other stakeholders to understand requirements and deliver solutions that meet business needs.
- Continuous Learning: Stay updated with the latest technologies, frameworks, and industry trends to continuously improve development practices.

Requirements
Technical Skills:
- Proficiency in frontend technologies like HTML, CSS, and JavaScript, and frameworks like React.js, Angular, or Vue.js.
- Strong backend development experience with Node.js, Python, Java, or similar languages.
- Hands-on experience with databases like MySQL, PostgreSQL, MongoDB, or similar.
- Familiarity with version control systems, notably Git.
- Experience with cloud services like AWS, Azure, or Google Cloud.
- Knowledge of CI/CD pipelines and DevOps practices.
- Understanding of security principles and how to apply them to web applications.

Soft Skills:
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a team environment.
- Ability to manage multiple tasks and projects simultaneously.
- Eagerness to learn new technologies and improve existing skills.
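Several of the responsibilities above centre on RESTful APIs connecting frontend and backend. A minimal sketch of the request/response contract, written with Python's standard library purely for brevity (the `/api/health` endpoint is an invented example; a real Java stack would typically serve it via Spring Boot or similar):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # One hypothetical JSON endpoint; anything else returns 404.
        if self.path == "/api/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), ApiHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "frontend" side: an HTTP GET that parses the JSON response.
url = f"http://127.0.0.1:{server.server_port}/api/health"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
print(data)  # → {'status': 'ok'}
server.shutdown()
```

The point is the contract, HTTP verbs plus JSON bodies over well-known paths, which is what lets a React or Angular frontend talk to any backend language listed above.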
Pune
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and financial domains, with a strong focus on creating robust, scalable, and efficient solutions.

Key Responsibilities:
- Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.
- Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions.
- ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.
- Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.
- Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.
- Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.
- Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.
- Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

Requirements
Required Qualifications:
- 5+ years of experience with Python programming.
- 5+ years of experience in cloud infrastructure, particularly AWS.
- 3+ years of experience with PySpark, including usage with EMR or Glue Notebooks.
- 3+ years of experience with Apache Airflow for workflow orchestration.
- Solid experience with data analysis in fast-paced environments.
- Strong understanding of capital markets, financial systems, or prior experience in the financial domain is a must.
- Proficiency with cloud-native technologies and frameworks.
- Familiarity with CI/CD practices and tools like Jenkins, GitLab CI/CD, or AWS CodePipeline.
- Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.
- Excellent problem-solving skills and ability to handle complex technical challenges.
- Strong communication and interpersonal skills for collaboration across teams and presenting solutions to diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment.

Benefits: Standard Company Benefits
Hyderabad
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time
We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.

Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
- Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
- Implement data quality checks and ensure compliance with data governance standards.
- Troubleshoot and resolve data discrepancies and performance issues.
- Document ETL processes, workflows, and technical specifications for future reference.

Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in ETL development, data engineering, or data warehousing.
- Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
- Proficiency in SQL and a strong understanding of database concepts (relational and non-relational).
- Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
- Strong problem-solving skills and the ability to troubleshoot data-related issues.
- Knowledge of scripting languages like Python or Shell scripting is a plus.
- Good communication skills to collaborate with cross-functional teams.
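The extract-transform-load cycle with data-quality checks described above can be sketched end to end with SQLite standing in for both source and warehouse; all table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract: a raw source table (normally an API dump or staging load).
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "10.50", "IN"), (2, "bad-value", "US"), (3, "7.25", "IN")],
)

# Transform + Load: cast amounts to REAL, dropping rows that are not numeric.
conn.execute(
    "CREATE TABLE fact_orders (id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
)
conn.execute(
    """
    INSERT INTO fact_orders
    SELECT id, CAST(amount AS REAL), country
    FROM raw_orders
    WHERE amount GLOB '[0-9]*.[0-9]*'
    """
)

# Data-quality check: row count loaded, and no negative amounts slipped through.
loaded = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
bad = conn.execute("SELECT COUNT(*) FROM fact_orders WHERE amount < 0").fetchone()[0]
print(loaded, bad)  # → 2 0
```

A tool like Talend generates and orchestrates jobs of exactly this shape against production databases, with the quality checks feeding monitoring rather than a print statement.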
Gurugram
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Senior Specialist Cloud Engineer - Contact Centre Innovation & GenAI

Role Summary: We are seeking an experienced and highly skilled Senior Specialist Cloud Engineer to join our innovative team. In this role, you will be responsible for designing, implementing, and maintaining cloud-based solutions using cutting-edge technologies. You will play a crucial role in optimizing our cloud infrastructure, improving system performance, and ensuring the scalability and reliability of our applications.

What you will do (Roles & Responsibilities):
- Design and implement complex cloud-based solutions using AWS services
- Design and optimize database schemas and queries, particularly with DynamoDB
- Write, test, and maintain high-quality Python code for cloud-based applications
- Work on Amazon Connect and integrate Amazon services
- Collaborate with cross-functional teams to identify and implement cloud-based solutions
- Ensure security, compliance, and best practices in cloud infrastructure
- Troubleshoot and resolve complex technical issues in cloud environments
- Mentor junior engineers and contribute to the team's technical growth
- Stay up-to-date with the latest cloud technologies and industry trends

Requirements
What you need to succeed (MUST haves):
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5-9 years of experience in cloud engineering, with a strong focus on AWS
- Extensive experience with Python programming and software development
- Strong knowledge of database systems, particularly DynamoDB
- Hands-on experience with Amazon Connect
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities

The ideal candidate will also have:
- Experience with containerization technologies (e.g., Docker, Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Familiarity with serverless architectures and microservices
- Experience with data analytics and big data technologies
- Understanding of machine learning and AI concepts
- Contributions to open-source projects or technical communities
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are a plus
- Experience mentoring junior engineers or leading small teams
- Strong project management skills and the ability to manage multiple priorities

If you are passionate about cloud technologies, have a proven track record of delivering innovative solutions, and thrive in a collaborative environment, we want to hear from you. Join our team and help shape the future of cloud computing!

Benefits: As per company standards.
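The must-have DynamoDB skill above revolves around its data model: items live under a partition key, ordered by an optional sort key, and a Query targets a single partition. A toy in-memory model of that access pattern (no AWS calls; the table, keys, and values are invented for illustration):

```python
from collections import defaultdict

class ToyTable:
    """In-memory stand-in for a DynamoDB table with partition + sort keys."""

    def __init__(self):
        self._partitions = defaultdict(dict)

    def put_item(self, pk, sk, item):
        self._partitions[pk][sk] = item

    def query(self, pk, sk_prefix=""):
        # Like a DynamoDB Query: one partition, a sort-key begins_with
        # condition, results returned in sort-key order.
        part = self._partitions.get(pk, {})
        return [part[sk] for sk in sorted(part) if sk.startswith(sk_prefix)]

# E.g. contact-centre call records, partitioned by customer.
calls = ToyTable()
calls.put_item("customer#42", "call#2024-01-03", {"duration": 310})
calls.put_item("customer#42", "call#2024-02-11", {"duration": 95})
calls.put_item("customer#7", "call#2024-01-05", {"duration": 40})

jan = calls.query("customer#42", sk_prefix="call#2024-01")
print(jan)  # → [{'duration': 310}]
```

Designing the schema means choosing keys so that every hot query hits one partition like this, which is why the posting pairs "schema design" with DynamoDB specifically.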
Hyderabad
INR 2.0 - 5.0 Lacs P.A.
Work from Office
Full Time
Key Responsibilities:
- Installing, configuring, and troubleshooting all Windows and macOS systems.
- Securing the network by installing antivirus software, troubleshooting antivirus-related issues, and regularly updating the antivirus.
- Configuring and troubleshooting local network printers; resolving hardware-related issues in printers and other peripherals.
- Providing admin rights, remote desktop access, and file and folder access to users as per request.
- Troubleshooting all end-to-end technical problems through remote tools.
- Monitoring the compliance status of all desktops/servers in terms of patch/DAT status.
- Troubleshooting issues on Office 365 and escalating to the proper team.
- Identifying and solving issues with Microsoft products (Excel, PowerPoint, Word, Teams).
- Troubleshooting issues with Citrix connections and client VPN.
- Adding devices to Azure AD; creating, deploying, and managing Intune MDM.
- Creating, deploying, and managing app protection policies and device configuration policies from the Intune endpoint manager.
- Creating and managing security firewalls, switches, and ILL.
- Managing and updating McAfee web controls and firewall rules.
- Maintaining and monitoring CCTV cameras and eSSL access control.
- Identifying the causes of networking problems using diagnostic testing software and equipment.
- Resolving IT tickets regarding computer software, hardware, and application issues on time.
- Setting up equipment for employee use, performing or ensuring proper installation of cables, operating systems, or appropriate software.
- Installing and performing minor repairs to hardware, software, or peripheral equipment.

Requirements
- Good experience in system administration
- Technical support experience
- Experience in the ITIL process
- Experience in RIM (Remote Infrastructure Management)
- Good knowledge of virtualization and cloud concepts with VMware and/or OpenStack
- Excellent communication skills
Hyderabad
INR 37.5 - 45.0 Lacs P.A.
Work from Office
Full Time
Job Title: Technical Project Manager Location: Hyderabad Employment Type: Full-time Experience: 10+ years Domain: Banking and Insurance

We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration.

Responsibilities:
- Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication.
- Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects.
- Translate business requirements into technical specifications and work plans.
- Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies.
- Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows.
- Oversee code reviews and ensure adherence to data engineering best practices.
- Provide hands-on support, when necessary, in Python-based development or debugging.
- Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA.
- Track project metrics and prepare progress reports for stakeholders.

Requirements
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- 10+ years of experience in project management or technical leadership roles.
- Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming).
- Experience working with cloud platforms like AWS, GCP, or Azure.
- Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines.
- Strong communication and stakeholder management skills.

Benefits: Company standard benefits.
Hyderabad
INR 6.0 - 10.0 Lacs P.A.
Work from Office
Full Time
":" Job Title: Java Developer Location: Hyderabad Employment Type: Full-time Experience: 4+ years Domain: Banking and Insurance Key Responsibilities: Design, develop, and maintain scalable Java applications using Spring Boot framework. Build and deploy microservices-based architectures to support modular and efficient software solutions. Develop and optimize database interactions using Hibernate ORM. Collaborate with cross-functional teams including QA, DevOps, and Product Management to deliver end-to-end solutions. Write clean, reusable, and well-documented code following coding standards and best practices. Participate in code reviews, unit testing, and integration testing. Troubleshoot and resolve technical issues in a timely manner. Contribute to continuous improvement by suggesting and implementing new technologies or processes. Support deployments and basic cloud-related operations, working closely with cloud engineers or DevOps teams. Requirements Strong proficiency in Java programming language. Hands-on experience with Spring Boot framework and microservices architecture. Solid knowledge of Hibernate or other ORM frameworks. Understanding of RESTful API development and integration. Basic knowledge of cloud platforms (AWS, Azure, or GCP) and cloud-native application concepts. Experience with relational databases (MySQL, PostgreSQL, Oracle, etc.). Familiarity with version control systems such as Git. Good understanding of software development lifecycle (SDLC) and Agile methodologies. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Benefits Company standard benefits. ","
Hyderabad
INR 50.0 - 60.0 Lacs P.A.
Work from Office
Full Time
We are seeking an experienced Data Solution Architect to lead the design and implementation of scalable, secure, and high-performing data solutions across cloud and hybrid environments. The ideal candidate will bring deep expertise in Data Engineering, APIs, Python, Spark/PySpark, and enterprise cloud platforms such as AWS and Azure. This is a strategic, client-facing role that involves working closely with stakeholders, engineering teams, and business leaders to architect and deliver robust data platforms.

Key Responsibilities:
- Architect end-to-end data solutions across cloud (AWS/Azure) and on-premises environments
- Develop and integrate RESTful APIs for data ingestion, transformation, and distribution
- Define data architecture standards, best practices, and governance frameworks
- Work with DevOps and cloud teams to deploy solutions using CI/CD and infrastructure-as-code
- Guide and mentor data engineering teams in solution implementation and performance optimization
- Ensure high availability, scalability, and data security compliance across platforms
- Collaborate with product owners and stakeholders to translate business needs into technical specifications
- Conduct architecture reviews, risk assessments, and solution validation

Requirements
Required Skills & Experience:
- 15 to 22 years of total experience in IT, with at least 5+ years in data architecture roles
- Strong experience with data processing frameworks and building ETL solutions
- Proven expertise in designing and deploying solutions on the AWS and Azure cloud platforms
- Hands-on experience with data integration, real-time streaming, and API-based data access
- Proficiency in data modeling (structured, semi-structured, and unstructured data)
- Deep understanding of data lakes, data warehouses, and modern data mesh/architecture patterns
- Experience with tools such as Airflow, Glue, Databricks, Synapse, Redshift, or similar
- Knowledge of security, compliance, and governance practices in large-scale data platforms
- Strong communication, leadership, and client-facing skills

Benefits: Standard Company Benefits
Pune
INR 10.0 - 14.0 Lacs P.A.
Work from Office
Full Time
":" Were interested in hearing from people who Have solid hands-on experience of data engineering/ETL principles and practices with Unix/Linux, Ab Initio and other data integration tools and frameworks. Are confident demonstrating coding, design, debugging and problem-solving skills Pride themselves in the ability to mentor and provide technical assistance to team members. Are knowledgeable to apply and promote industry best patterns and practices. Have solid experience in developing and deploying high quality software solutions with comprehensive test coverage without supervision. Are comfortable with estimating development effort for new features Have the ability to lead Level 3 or above support and technical troubleshooting activity. Requirements Tech Skills We use a broad range of tools, languages, and frameworks. We dont expect you to know them all but experience or exposure with some of these (or equivalents) will set you up for success in this team. Hands-on programming experience of 8-12 years in Ab Initio related products and shell scripting. Background with both Batch and continuous flows is highly regarded. Experience with RDBMS (Oracle) and using SQL or other data integration/ETL tools. AWS Cloud Integration knowledge of MQs, REST APIs and Kafka DevOps and production support. Benefits As per company standards. ","
Hyderabad
INR 7.0 - 10.0 Lacs P.A.
Work from Office
Full Time
":" Job Title: PySpark Data Engineer Experience: 5 8 Years Location: Hyderabad Employment Type: Full-Time Job Summary: We are looking for a skilled and experienced PySpark Data Engineer to join our growing data engineering team. The ideal candidate will have 58 years of experience in designing and implementing data pipelines using PySpark , AWS Glue , and Apache Airflow , with strong proficiency in SQL . You will be responsible for building scalable data processing solutions, optimizing data workflows, and collaborating with cross-functional teams to deliver high-quality data assets. Key Responsibilities: Design, develop, and maintain large-scale ETL pipelines using PySpark and AWS Glue . Orchestrate and schedule data workflows using Apache Airflow . Optimize data processing jobs for performance and cost-efficiency. Work with large datasets from various sources, ensuring data quality and consistency. Collaborate with Data Scientists, Analysts, and other Engineers to understand data requirements and deliver solutions. Write efficient, reusable, and well-documented code following best practices. Monitor data pipeline health and performance; resolve data-related issues proactively. Participate in code reviews, architecture discussions, and performance tuning. Requirements 58 years of experience in data engineering roles. Strong expertise in PySpark for distributed data processing. Hands-on experience with AWS Glue and other AWS data services (S3, Athena, Lambda, etc.). Experience with Apache Airflow for workflow orchestration. Strong proficiency in SQL for data extraction, transformation, and analysis. Familiarity with data modeling concepts and data lake/data warehouse architectures. Experience with version control systems (e.g., Git) and CI/CD processes. Ability to write clean, scalable, and production-grade code. Benefits Company standard benefits. ","
Company Reviews
Anudeep Fitzgerald
9 months ago
I've been with data economy for 2yrs (still going ), and it's been an incredible journey. The work environment fosters growth and innovation, with sup...
raviteja bavanari
9 months ago
Working with DATAECONOMY is a thrilling and enjoyable experience so far in this long journey of mine. Learning here is very high and competitive. Ever...
suryavamsi vempati
10 months ago
Working at Data Economy has been a truly rewarding experience. I’ve joined this company as a fresher, through out working here company encourages it’s...
vivek reddy
a year ago
At Dataeconomy, we prioritize creating an familial atmosphere where every member feels valued and supported. Our work environment is designed to be fa...
Rohith kumar Gaddi 007
a year ago
Working with Data Economy is my Pleasure. As a fresher I have joined the Company and there is so much I have upskilled my career. It has the flexibili...
ankit gaurav singh
a year ago
DATAECONOMY fosters a friendly and welcoming environment, making it a pleasure to work here. Engaging projects keep you motivated and excited about yo...
Amol Bhosale
a year ago
I am working from last one and half year in Data Economy. I seems that organization has support to learn new cutting edge technology that will help u...
nikhil tyagi
a year ago
I believe data economy is the best place for freshers. It has a great work culture and flexibility for work life balance. We get opportunity to put ou...
suchit hemgire
a year ago
"I'm proud to be a part of "Data Economy" It's an amazing workplace with a fantastic team. The positive work environment and commitment to excellence...
pavan kumar
a year ago
I've thrived at DATAECONOMY thanks to its unwavering commitment to employee development and fostering a genuinely friendly team atmosphere. Equally im...
Chukka Dileepbabu
a year ago
In my time at DATAECONOMY, I've discovered a workplace that goes beyond expectations. The diverse culture, commitment to learning, and inspiring value...
Dipesh Thakur
a year ago
Great company for freshers to learn and explore new technologies. Intelligent and smart seniors gives support to the freshers for their work and learn...
Ajinkya Kulkarni
a year ago
Proud to be part of Data Economy. DataEconomy is a game-changer in the data sector. Work culture is good. Personal & professional balance is good
Shubham Jori
11 months ago
Dataeconomy is a excellent organisation to work into with ample opportunities in various technologies and also the work environment is awesome.Highly ...
Pradhyumn Gawade
9 months ago
Amazing Work Culture with appropriate work and human resource management. Opportunities to grow with proper training under company's wing
Krushna Bodake
10 months ago
Working with Data Economy is my Pleasure. I have upskilled my career. It has the flexibility for Work Life Balance. There are people with huge knowle...
Pooja Shelke
9 months ago
Great experience with Data Economy ! Team is highly supportive and great projects to work on...
KRUSHNA BODAKE
a year ago
The work life balance is good as well as current employees and colleagues supportive and involving.
Dinesh choudhary
a year ago
It's very good experience in Company.Good work culture and good professional and personal life balance.Good Career opportunity.
swapnil powar
a year ago
Very good place to work.Better working culture. Work flexibility with better cutting edge technologies.
Saurabh Dhoke
a year ago
It is a really good organization, it's growing fast now, management is really good and everyone is supportive here
ayush jaiswal
11 months ago
Great place to work. Members are very supportive. Good learning environment
ayush katiyar
9 months ago
Great place to work, exposure to variety of projects and technologies.
ajay umbarkar
10 months ago
Excellent work culture.More learning opportunities.Supportive management.
Rajesh Soora
a year ago
Great place to work. Good Work culture.
Aniket Adwankar
9 months ago
Great place to work
Narsimha Rao
a year ago
Good place to work.
Prakhar Agrawal
a year ago
Great place to work.
Lok Sandeep
a year ago
Personally liked the culture and employee friendly policies. Thanks to Human Resources Team (Sailaja Saranu, Enosh Digumarthi and Shruthi Mamidi)