IQHQ is the intelligence (IQ) headquarters (HQ) of today's organizations, maximizing the potential of Big Data by powering data-driven businesses with foremost intelligence resources. Although the volume of data generated every day keeps increasing, only a small fraction of it is used. Data is growing in importance as one of the most valuable resources, and organizations demand a solution that helps them efficiently aggregate, process, and analyze Big Data to make the right impactful decisions, discover hidden insights, leverage every emerging opportunity, and gain an inimitable competitive advantage. That solution is IQHQ. IQHQ enables users to:
- Explore aggregated external data in structured form
- Import internal data and integrate it with external data
- Analyze and visualize data to discover hidden insights and opportunities
- Monetize data / acquire data from other users
- Request "Custom Intelligence", a flagship custom data sourcing service for organizations with special data needs.
Hyderabad
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Position: Senior .Net Developer
Location: Hyderabad
Mode of Employment: Full time
Salary: Industry Standard

Job Description
We are seeking a talented and experienced Senior .NET developer to join our growing team. You will be responsible for the design, development, deployment, and maintenance of software applications using the .NET framework. You will work closely with other developers, designers, and product managers to deliver high-quality, scalable solutions.

Responsibilities
- Participate in requirements gathering and analysis sessions
- Design, develop, test, and deploy web applications using ASP.NET, ASP.NET Core, and C#
- Develop and consume web services, WCF services, and .NET Core microservices
- Work with XML and perform XSLT transformations (3+ years of experience required)
- Manage data using SQL databases (4+ years of experience); experience with PL/SQL is a plus
- Utilize source code management tools like Git, Bitbucket, and SVN
- Collaborate effectively in an Agile development environment using Jira and Bitbucket
- Experience with cloud platforms like Azure or AWS is a must
- 3+ years of web application development is a must
- Development experience in WPF is good to have

Requirements / Qualifications
- 5-9 years of experience in software development using .NET Framework 4.8 and above, .NET 6 and above, and .NET Core 2 and above
- Proven ability to write clean, maintainable, and well-documented code
- Strong understanding of object-oriented programming principles
- Experience with unit testing and code coverage is a plus
- Excellent communication and collaboration skills
- Ability to work independently and as part of a team

Benefits
Company Standard Benefits.
Pune
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
We are seeking a dynamic and experienced Lead Software Test Engineer with a strong background in Selenium and API automation testing using Java. As a key member of our testing team, you will be responsible for leading and executing test strategies, mentoring team members, and ensuring the delivery of high-quality software products. The ideal candidate should have in-depth knowledge of automation testing, excellent leadership skills, and a passion for driving excellence in testing practices.

Key Responsibilities:
- Define and implement test strategies, methodologies, and best practices for Selenium and API automation testing by developing an effective automation framework.
- Design, develop, and maintain robust and scalable automation frameworks using Selenium WebDriver and Java.
- Create and execute automated test scripts for web applications, ensuring comprehensive test coverage.
- Develop and implement automated tests for APIs and microservices using tools such as Rest Assured or similar.
- Verify data integrity, security, and performance of APIs through systematic testing.
- Collaborate with cross-functional teams to develop test plans, test cases, and test scenarios.
- Execute test cases and ensure the timely identification and resolution of defects.
- Integrate automated tests into CI/CD pipelines to support continuous testing and deployment.
- Implement and optimize automated regression testing to maintain software stability.
- Work closely with development teams, product managers, and other stakeholders to ensure alignment with project goals and requirements.
- Provide timely and accurate testing status reports to project stakeholders.
- Champion and enforce quality assurance processes and standards throughout the software development lifecycle.
- Conduct code reviews and ensure the adoption of best coding practices within the testing team.
- Lead and mentor a team of software test engineers, providing technical guidance and support.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in leading Selenium and API automation testing efforts.
- Expert at understanding requirements and developing an automation framework from scratch.
- Strong programming skills in Java and hands-on experience with testing frameworks such as TestNG or JUnit.
- Extensive experience in designing and implementing automation frameworks for web applications.
- Solid understanding of API testing principles and tools.
- Experience with version control systems (e.g., Git) and build tools (e.g., Maven, Gradle).
- Familiarity with CI/CD tools (e.g., Jenkins, Bamboo).
- Excellent leadership, communication, and interpersonal skills.
- Ability to drive innovation and continuous improvement within the testing team.
Pune
INR 5.0 - 8.0 Lacs P.A.
Work from Office
Full Time
Position: Tableau Server Administrator
Location: Balewadi Highstreet
Job Type: Full-time

Introduction:
We are looking for an experienced and highly motivated Tableau Server Administrator to join our dynamic team. As a Tableau Server Administrator, you will be responsible for the installation, configuration, maintenance, and management of Tableau Server environments. You will also ensure the optimal performance, security, and uptime of Tableau Server while providing administrative support and troubleshooting assistance to users.

Key Responsibilities:
Tableau Server Administration:
- Install, configure, and manage Tableau Server environments (version upgrades, patches, and migrations).
- Configure Tableau Server for optimal performance and scalability.
- Set up and maintain security settings, user roles, and permissions.
- Manage Tableau Server environments, including user access, server monitoring, and resource management.
- Perform regular backups, data recovery, and disaster recovery procedures for Tableau Server.
Server Monitoring & Performance Optimization:
- Monitor and troubleshoot Tableau Server performance issues.
- Optimize server performance by analyzing and improving configurations.
- Analyze Tableau Server logs, troubleshoot issues, and implement fixes.
- Ensure the server's high availability, stability, and reliability.
Collaboration & User Support:
- Work closely with business analysts, developers, and other stakeholders to ensure effective Tableau usage.
- Provide technical support and training to end-users to maximize the utility of Tableau Server.
- Respond to Tableau user queries and issues in a timely manner, providing troubleshooting and solutions.
Security and Compliance:
- Implement Tableau Server security best practices and ensure data protection.
- Maintain user authentication and access control policies.
- Ensure compliance with internal and external security standards.
Integration and Automation:
- Support integration of Tableau Server with other data tools and systems (e.g., databases, data warehouses).
- Automate and schedule tasks such as data extracts, report distribution, and server monitoring.
Documentation & Reporting:
- Maintain detailed documentation for Tableau Server configurations, processes, and troubleshooting steps.
- Provide regular performance and usage reports to senior management.
- Create and maintain knowledge articles for Tableau-related procedures and guidelines.

Qualifications:
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Experience:
- 5+ years of experience as a Tableau Server Administrator or in a similar role.
- Experience with Tableau Server installation, configuration, administration, and performance tuning.
- Hands-on experience with SQL and database management (e.g., MS SQL Server, MySQL).
Technical Skills:
- Expertise in Tableau Server setup, administration, and deployment.
- Proficiency in configuring Tableau Server for high availability and failover.
- Familiarity with server and data security protocols.
- Experience with scripting languages (e.g., Python, PowerShell, or Bash).
- Strong understanding of Tableau architecture, including data sources, workbooks, views, and server infrastructure.
Soft Skills:
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Team player with a collaborative approach to work.
Preferred Skills:
- Tableau Server certification (Tableau Server Certified Associate or Tableau Server Certified Professional).
- Experience with Tableau Desktop or Tableau Prep.
- Knowledge of cloud-based Tableau deployments (AWS, Azure).
- Familiarity with networking and firewall configuration for Tableau Server.
Pune
INR 12.0 - 16.0 Lacs P.A.
Work from Office
Full Time
":" Job Title: Data Governance Analyst Location: Pune (Hybrid; thrice a week in-office requirement) Company: Leading Insurance and Investments Firm We are seeking a skilled and detail-oriented Data Governance Analyst to join our Data Lakehouse program team . The ideal candidate will play a crucial role in ensuring data integrity, quality, and compliance across our organization, with a focus on Data Ownership/Stewardship , Metadata Management, Data Quality, and Reference Data Management. Key Responsibilities: 1. Metadata Management: - Review and validate metadata documents and ingestion templates populated by source system Subject Matter Experts (SMEs) and Business Analysts (BAs) - Analyze and recommend improvements to existing data dictionaries, business glossaries, access controls, data classification, and data quality requirements - Ensure metadata accuracy and completeness across all data assets 2. Data Ownership and Stewardship - Collaborate closely with Data Owners and Stewards to obtain approvals and sign-offs on data governance initiatives - Align data governance standards with business requirements and needs - Facilitate communication between technical teams and business stakeholders 3. Data Quality: - Review and enforce data quality requirements across the organization - Develop and implement data quality metrics and monitoring processes - Identify and address data quality issues in collaboration with relevant teams 4. Reference Data Management: - Review and standardize reference data and Lists of Values (LOVs) - Ensure proper maintenance and version control of reference data - Collaborate with business units to define and implement reference data standards 5. 
Cross-functional Collaboration:
- Work closely with Business Systems Analysts, Data Architects, Change Management, and Technology Governance teams
- Participate in data governance meetings and initiatives
- Contribute to the development and implementation of data governance policies and procedures

Preferred Qualifications:
1. Professional certifications in data governance or data management (e.g., CDMP, DGCP)
2. Experience with data lakehouse architectures and technologies
3. Familiarity with Agile methodologies and project management practices
4. Experience with data governance tools and applications (e.g., Talend, Erwin Data Modeler)

Requirements:
1. Bachelor's degree in Computer Science, Information Systems, or a related field
2. 5+ years of experience in data governance, data management, or a similar role
3. Strong understanding of data governance principles, metadata management, and data quality concepts
4. Experience with data dictionaries, business glossaries, and data classification methodologies
5. Familiarity with insurance and investment industry data standards and regulations
6. Excellent analytical and problem-solving skills
7. Strong communication and interpersonal skills, with the ability to work effectively with both technical and non-technical stakeholders
8. Proficiency in data governance tools and technologies (e.g., data catalogs, metadata repositories)
9. Knowledge of data privacy regulations and best practices
Hyderabad
INR 4.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Job Summary: We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.

Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
- Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
- Implement data quality checks and ensure compliance with data governance standards.
- Troubleshoot and resolve data discrepancies and performance issues.
- Document ETL processes, workflows, and technical specifications for future reference.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in ETL development, data engineering, or data warehousing.
- Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
- Proficiency in SQL and strong understanding of database concepts (relational and non-relational).
- Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
- Strong problem-solving skills and ability to troubleshoot data-related issues.
- Knowledge of scripting languages like Python or Shell scripting is a plus.
- Good communication skills to collaborate with cross-functional teams.

Benefits: As per company standards.
Hyderabad
INR 10.0 - 14.0 Lacs P.A.
Work from Office
Full Time
We are on the lookout for a resourceful and proficient AI engineer/developer to join our forward-thinking team. The ideal candidate will not only possess a strong foundation in Python programming, advanced mathematics, and algorithms, but also have specialized knowledge in generative AI (GenAI) and large language models (LLMs). This role is pivotal in developing machine learning and deep learning models, understanding and applying various neural network architectures, and handling intricate data processing and visualization. The successful candidate will be adept in natural language processing, deploying AI and ML solutions, and upholding AI security.

Experience Required: 5 to 10 years of relevant experience in AI/ML development.

Key Responsibilities:
- Develop and refine machine learning and deep learning models.
- Apply expertise in neural network architectures, specifically for GenAI and LLM applications.
- Handle complex data processing, cleaning, and visualization tasks.
- Utilize natural language processing techniques for advanced AI solutions.
- Efficiently deploy AI/ML models in production environments, focusing on scalability and robustness.
- Uphold and enhance AI security measures to protect systems and data.
- Collaborate with cross-functional teams to integrate AI solutions, particularly GenAI and LLMs, into broader systems and applications.
- Stay abreast of the latest trends and advancements in AI, machine learning, GenAI, and LLMs.

Requirements:
- Proficiency in Python programming.
- Advanced knowledge in mathematics and algorithm development.
- Experience in developing machine learning and deep learning models.
- Strong understanding of neural network architectures, with emphasis on GenAI and LLMs.
- Skilled in data processing and visualization.
- Experienced in natural language processing.
- Knowledgeable in AI/ML deployment, DevOps practices, and cloud services.
- In-depth understanding of AI security principles and practices.
Benefits: Standard Company Benefits.
Hyderabad
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
We are seeking a Lead ETL Data Engineer to design, develop, and optimize data pipelines, ensuring smooth data integration across our platforms. This role will lead a team of ETL developers and work closely with data analysts, engineers, and business stakeholders to drive data solutions in a cloud environment.

Key Responsibilities:
ETL Development & Data Pipeline Design
- Lead the design, development, and optimization of ETL processes using Talend (or similar ETL tools).
- Build, automate, and maintain scalable data pipelines for efficient data processing.
- Ensure data quality, consistency, and performance across ETL workflows.
Database & Data Warehouse Management
- Work with relational and NoSQL databases, ensuring optimized SQL queries for performance.
- Implement data warehouse (DWH) solutions on AWS (Redshift, S3, Glue, RDS) or other cloud environments.
- Perform data modeling to support business intelligence and analytics.
Leadership & Collaboration
- Guide and mentor a team of ETL developers and data engineers.
- Collaborate with data scientists, analysts, and business teams to understand data needs.
- Drive best practices in data governance, security, and compliance.
Performance Optimization & Troubleshooting
- Monitor and troubleshoot ETL performance issues.
- Optimize database performance and ensure low-latency data processing.
- Automate error handling and data recovery strategies.

Required Skills & Qualifications:
- 10+ years of experience in ETL development and data engineering.
- Expertise in ETL tools like Talend, Informatica, or Apache NiFi.
- Strong proficiency in SQL and database optimization techniques.
- Hands-on experience with AWS cloud services (Redshift, Glue, Lambda, S3, RDS, etc.).
- Experience with big data technologies (Spark, Hadoop, or Kafka) is a plus.
- Solid understanding of data modeling, warehousing (DWH), and governance.
- Excellent problem-solving and communication skills.
- Experience in leading a team and driving technical best practices.

Benefits: As per company standards.
Hyderabad
INR 25.0 - 32.0 Lacs P.A.
Work from Office
Full Time
We are seeking an experienced DevOps Engineer with a strong background in container orchestration, tracing, and CI/CD tools. In this role, you will be responsible for managing Kubernetes clusters, troubleshooting container environments, and ensuring seamless application performance through effective tracing and CI/CD pipeline management. Cloud experience is a plus but not essential.

Key Responsibilities:
Kubernetes & Container Management:
- Deploy, manage, and troubleshoot Kubernetes clusters using both open-source Kubernetes and Red Hat OpenShift.
- Utilize Istio and Calico for service mesh management and network policy enforcement.
- Leverage strong troubleshooting skills to resolve container and Docker-related issues.
Application Tracing & Monitoring:
- Implement and maintain Datadog monitoring and tracing systems.
- Analyze and troubleshoot application performance issues using deep tracing insights.
CI/CD Pipeline Management:
- Design, build, and maintain CI/CD pipelines using Jenkins, ensuring integration with Hoover libraries where applicable.
- Manage artifact repositories with Nexus and JFrog.
- Oversee SSL certificate management through Venafi to ensure secure deployments.
- Utilize Linux systems for continuous integration and deployment tasks.
Collaboration & Documentation:
- Work closely with development and operations teams to streamline the deployment process.
- Document processes, configurations, and troubleshooting steps for future reference.

Qualifications:
- Proven experience with Kubernetes (open source and Red Hat OpenShift) and container technologies such as Docker.
- Strong knowledge of service meshes and network policies using Istio and Calico.
- Hands-on experience with Datadog for application tracing and performance monitoring.
- Proficiency in designing and managing CI/CD pipelines with Jenkins (experience with Hoover libraries is a plus), as well as familiarity with Nexus, JFrog, and Venafi for SSL certificate management.
- Solid working knowledge of Linux operating systems.
- Cloud platform experience is optional but will be considered an asset.
Pune
INR 22.5 - 27.5 Lacs P.A.
Work from Office
Full Time
Role: Technical Project Manager
Qualification: BTech/MTech/MCA
Job Location: Pune
Experience: 12+ years
Work Mode: Hybrid

Job Overview: We are seeking a skilled and motivated Technical Project Manager to lead, plan, and oversee projects focused on data engineering and business intelligence tools. This role requires a deep understanding of data engineering processes, BI technologies, and the ability to manage cross-functional teams while ensuring the delivery of high-quality solutions that meet business requirements.

Key Responsibilities:
- Lead and manage end-to-end project lifecycles, ensuring successful delivery of data engineering projects and BI tools initiatives.
- Collaborate with data engineers, data analysts, and business stakeholders to define project scope, requirements, and goals.
- Develop detailed project plans, timelines, and schedules, and ensure resources are allocated effectively.
- Track and report on project progress, addressing risks, issues, and delays promptly.
- Serve as the primary point of contact for business stakeholders, ensuring their needs and expectations are met while managing project priorities.
- Facilitate communication between technical teams and non-technical stakeholders, ensuring that all parties are informed of project status and milestones.
- Work closely with data engineering teams to ensure the successful design, development, and deployment of data pipelines, ETL processes, and integration of data systems.
- Help evaluate and select BI tools (e.g., Power BI, Tableau, Looker) and ensure their alignment with the organization's needs.
- Oversee the implementation of BI reporting solutions, dashboards, and analytics tools to deliver actionable insights to business users.
- Proactively identify project risks and issues and develop mitigation strategies.
- Lead problem-solving efforts to overcome roadblocks and ensure timely project delivery.
- Continuously evaluate and refine project management processes and methodologies to improve efficiency and effectiveness in data-driven projects.
- Stay up to date with industry trends and best practices in data engineering and BI tools to ensure the team uses cutting-edge technologies and techniques.

Requirements / Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Proven experience as a Technical Project Manager in a data engineering or BI-focused environment.
- Strong understanding of data engineering concepts, ETL processes, and database technologies (e.g., SQL, NoSQL, data lakes, cloud platforms).
- Familiarity with business intelligence tools (e.g., Power BI, Tableau, Looker, Qlik) and data visualization best practices.
- Experience with Agile or Scrum methodologies and tools (Jira, Confluence, etc.).
- Excellent leadership, communication, and interpersonal skills to work with both technical and non-technical teams.
- Strong organizational and problem-solving skills with the ability to manage multiple projects simultaneously.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus.

Benefits: Company Standard benefits
Pune
INR 4.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Job Description of the Data Modeler Role
The Data Modeler will work towards the design and implementation of new data structures to support the project teams delivering on ETL and data warehouse design, managing the enterprise data model, the maintenance of the data, and enterprise data integration approaches.

Technical Responsibilities:
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent, and interpretable manner.
- Gather, distil, and harmonize data requirements, and design coherent conceptual, logical, and physical data models and associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings with complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design, covering both transactional and dimensional data modelling.
- Normalize/de-normalize data structures, and introduce hierarchies and inheritance wherever required in existing/new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and to propose standardized data models.
- Contribute to improving the Data Management data models.
- Be an influencer: present and facilitate discussions to understand business requirements, and develop dimension data models based on these capabilities and industry best practices.

Requirements:
- Extensive practical experience in Information Technology and software development projects, with at least 8+ years of experience in designing operational data store / data warehouse solutions.
- Extensive experience in any of the data modelling tools: Erwin / SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes, and best practices.
- Proficient in data modelling, including conceptual, logical, and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- Possess a combination of solid business knowledge and technical expertise with strong communication skills.
- Demonstrate excellent analytical and logical thinking.
- Good verbal and written communication skills, and ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Good to have:
- Understanding of the Insurance domain.
- Basic understanding of AWS cloud.
- Good understanding of Master Data Management, Data Quality, and Data Governance.
- Basic understanding of data visualization tools like SAS VA and Tableau.
- Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle.
Hyderabad
INR 30.0 - 35.0 Lacs P.A.
Work from Office
Full Time
":" Job Description: We are looking for a highly skilled Senior Data Scientist with 39 years of experience specializing in Python, Large Language Models (LLMs), NLP, Machine Learning, and Generative AI . The ideal candidate will have a deep understanding of building intelligent systems using modern AI frameworks and deploying them into scalable, production-grade environments. You will work closely with cross-functional teams to build innovative AI solutions that deliver real business value. Responsibilities: Design, develop, and deploy ML/NLP solutions using Python and state-of-the-art AI frameworks. Apply LLMs and Generative AI techniques to solve real-world problems. Build, train, fine-tune, and evaluate models for NLP and GenAI tasks. Collaborate with data engineers, MLOps, and product teams to operationalize models. Contribute to the development of scalable AI services and applications. Analyze large datasets to extract insights and support model development. Maintain clean, modular, and version-controlled code using Git. Requirements Must-Have Skills: 310 years of hands-on experience with Python for data science and ML applications. Strong expertise in Machine Learning algorithms and model development. Proficient in Natural Language Processing (NLP) and text analytics. Experience with Large Language Models (LLMs) and Generative AI frameworks (e.g., LangChain, Hugging Face Transformers). Familiarity with model deployment and real-world application integration. Experience with version control systems like Git . Good to Have: Experience with PySpark for distributed data processing. Exposure to MLOps practices and model lifecycle management. Familiarity with cloud platforms such as AWS, GCP, or Azure. Knowledge of vector databases (e.g., FAISS, Pinecone) and embeddings. Educational Qualification: Bacheloror Masterdegree in Computer Science, Data Science, Statistics, or a related field. 
Benefits:
- Work with cutting-edge technologies in a collaborative and forward-thinking environment.
- Opportunities for continuous learning, skill development, and career growth.
- Exposure to high-impact projects in AI and data science.
Hyderabad
INR 3.0 - 6.0 Lacs P.A.
Work from Office
Full Time
We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and financial domains, with a strong focus on creating robust, scalable, and efficient solutions.

Key Responsibilities:
- Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.
- Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions, with a minimum of 5 years of experience.
- ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.
- Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.
- Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.
- Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.
- Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.
- Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

Required Qualifications:
- 5+ years of experience with Python programming.
- 5+ years of experience in cloud infrastructure, particularly AWS.
- 3+ years of experience with PySpark, including usage with EMR or Glue Notebooks.
- 3+ years of experience with Apache Airflow for workflow orchestration.
- Solid experience with data analysis in fast-paced environments.
- Strong understanding of capital markets, financial systems, or prior experience in the financial domain is a must.
- Proficiency with cloud-native technologies and frameworks.
- Familiarity with CI/CD practices and tools like Jenkins, GitLab CI/CD, or AWS CodePipeline.
- Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.
- Excellent problem-solving skills and ability to handle complex technical challenges.
- Strong communication and interpersonal skills for collaboration across teams and presenting solutions to diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment.

Benefits: Standard Company Benefits
Hyderabad
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Job Overview: We seek a highly skilled Java Full Stack Developer who is comfortable with frontend and backend development. The ideal candidate will be responsible for developing and designing frontend web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. The Java Full Stack Developer will be required to see a project through from conception to final product, requiring good organizational skills and attention to detail.

Key Responsibilities:
- Frontend Development: Design and develop user-facing web applications using modern frontend languages like HTML, CSS, and JavaScript and frameworks like React.js, Angular, or Vue.js.
- Backend Development: Build and maintain server-side application logic using languages such as Node.js, Python, Ruby, Java, or PHP, and manage database interactions with MySQL, PostgreSQL, MongoDB, or other database systems.
- API Development and Integration: Develop and integrate RESTful APIs to connect frontend and backend components, ensuring smooth data flow and communication between different parts of the application.
- Database Management: Design, implement, and manage databases, ensuring data integrity, security, and optimal performance.
- Version Control and Collaboration: Use Git and other version control systems to track code changes and collaborate with other team developers.
- Deployment and DevOps: Automate deployment processes, manage cloud infrastructure, and ensure the scalability and reliability of applications through CI/CD pipelines.
- Security Implementation: Implement security best practices to protect the application from vulnerabilities, including authentication, authorization, and data encryption.
- Cross-Platform Optimization: Ensure the application is responsive and optimized for different devices, platforms, and browsers.
Troubleshooting and Debugging: Identify, diagnose, and fix bugs and performance issues in the application, ensuring a smooth user experience. Collaboration and Communication: Work closely with product managers, designers, and other stakeholders to understand requirements and deliver solutions that meet business needs. Continuous Learning: Stay updated with the latest technologies, frameworks, and industry trends to improve development practices continuously. Requirements Technical Skills: Proficiency in frontend technologies like HTML, CSS, JavaScript, and frameworks like React.js, Angular, or Vue.js. Strong backend development experience with Node.js, Python, Java, or similar languages. Hands-on experience with databases like MySQL, PostgreSQL, MongoDB, or similar. Familiarity with version control systems, notably Git. Experience with cloud services like AWS, Azure, or Google Cloud. Knowledge of CI/CD pipelines and DevOps practices. Understanding of security principles and how to apply them to web applications. Soft Skills: Excellent problem-solving skills and attention to detail. Strong communication skills and the ability to work collaboratively in a team environment. Ability to manage multiple tasks and projects simultaneously. Eagerness to learn new technologies and improve existing skills.
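The API responsibility above can be sketched in miniature. The example below is a hypothetical, standard-library-only Python sketch of a RESTful JSON endpoint (the `USERS` store and the `/users/<id>` route are invented for illustration); in this role the same shape would more typically be built with a Java framework such as Spring Boot.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory resource store, for illustration only.
USERS = {1: {"id": 1, "name": "Asha"}}

class UserAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /users/<id> returns the user as JSON, 404 otherwise.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "users" and parts[1].isdigit():
            user = USERS.get(int(parts[1]))
            if user is not None:
                body = json.dumps(user).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for the example.
        pass

def serve_in_background(port: int) -> HTTPServer:
    """Start the API on localhost in a daemon thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), UserAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client would then issue `GET http://127.0.0.1:<port>/users/1` and receive the user serialized as JSON, which is the frontend/backend data flow the posting describes.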
Pune
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and financial domains, with a strong focus on creating robust, scalable, and efficient solutions.

Key Responsibilities:
- Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.
- Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions, drawing on a minimum of 5 years of experience.
- ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.
- Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.
- Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.
- Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.
- Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.
- Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

Requirements
Required Qualifications:
- 5+ years of experience with Python programming.
- 5+ years of experience in cloud infrastructure, particularly AWS.
- 3+ years of experience with PySpark, including usage with EMR or Glue Notebooks.
- 3+ years of experience with Apache Airflow for workflow orchestration.
- Solid experience with data analysis in fast-paced environments.
- Strong understanding of capital markets, financial systems, or prior experience in the financial domain is a must.
- Proficiency with cloud-native technologies and frameworks.
- Familiarity with CI/CD practices and tools like Jenkins, GitLab CI/CD, or AWS CodePipeline.
- Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.
- Excellent problem-solving skills and ability to handle complex technical challenges.
- Strong communication and interpersonal skills for collaboration across teams and presenting solutions to diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment.

Benefits
Standard Company Benefits
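To make the ETL-pipeline responsibility concrete, here is a minimal sketch of a single transform step in plain Python. The trade-record fields (`symbol`, `trade_date`, `price`, `quantity`) are hypothetical; in this role the same logic would typically run as a PySpark job on EMR or in a Glue notebook, but the shape of the step is the same.

```python
from datetime import datetime

def transform(records):
    """Drop malformed rows, normalize the trade date to ISO format,
    and derive a notional value per trade (price * quantity)."""
    cleaned = []
    for row in records:
        # Drop malformed rows rather than failing the whole batch.
        if row.get("price") is None or row.get("quantity") is None:
            continue
        cleaned.append({
            "symbol": row["symbol"].upper(),
            # Assumed input format dd/mm/yyyy -> ISO yyyy-mm-dd.
            "trade_date": datetime.strptime(
                row["trade_date"], "%d/%m/%Y").date().isoformat(),
            "notional": round(row["price"] * row["quantity"], 2),
        })
    return cleaned
```

In a real pipeline this step would sit between an extract (e.g., reading from S3) and a load (e.g., writing to Redshift), and would be orchestrated by a scheduler such as Airflow.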
Hyderabad
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time
We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.

Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
- Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
- Implement data quality checks and ensure compliance with data governance standards.
- Troubleshoot and resolve data discrepancies and performance issues.
- Document ETL processes, workflows, and technical specifications for future reference.

Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in ETL development, data engineering, or data warehousing.
- Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
- Proficiency in SQL and a strong understanding of database concepts (relational and non-relational).
- Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
- Strong problem-solving skills and the ability to troubleshoot data-related issues.
- Knowledge of scripting languages like Python or shell scripting is a plus.
- Good communication skills to collaborate with cross-functional teams.
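The extract-transform-load cycle this role centers on can be sketched end to end with SQLite standing in for the source and target databases (the `raw_orders` and `orders_clean` tables are hypothetical). A Talend or Glue job would express the same three phases with its own components, but the phases themselves are identical.

```python
import sqlite3

def run_etl(conn):
    """One idempotent ETL pass: extract raw rows, drop null amounts,
    normalize to two decimals, and reload the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean (id INTEGER, amount REAL)")
    # Extract: pull the raw rows.
    rows = conn.execute("SELECT id, amount FROM raw_orders").fetchall()
    # Transform: a simple data-quality check plus normalization.
    clean = [(i, round(a, 2)) for i, a in rows if a is not None]
    # Load: truncate-and-reload keeps the pass idempotent on reruns.
    conn.execute("DELETE FROM orders_clean")
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)
```

The truncate-and-reload pattern shown here is the simplest way to make a batch load safe to rerun; incremental loads would instead track a watermark column.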
Gurugram
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Senior Specialist Cloud Engineer - Contact Centre Innovation & GenAI

Role Summary:
We are seeking an experienced and highly skilled Senior Specialist Cloud Engineer to join our innovative team. In this role, you will be responsible for designing, implementing, and maintaining cloud-based solutions using cutting-edge technologies. You will play a crucial role in optimizing our cloud infrastructure, improving system performance, and ensuring the scalability and reliability of our applications.

What you will do: (Roles & Responsibilities)
- Design and implement complex cloud-based solutions using AWS services
- Design and optimize database schemas and queries, particularly with DynamoDB
- Write, test, and maintain high-quality Python code for cloud-based applications
- Work on Amazon Connect and integrate Amazon services
- Collaborate with cross-functional teams to identify and implement cloud-based solutions
- Ensure security, compliance, and best practices in cloud infrastructure
- Troubleshoot and resolve complex technical issues in cloud environments
- Mentor junior engineers and contribute to the team's technical growth
- Stay up-to-date with the latest cloud technologies and industry trends

Requirements
What you need to succeed: (MUST haves)
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5-9 years of experience in cloud engineering, with a strong focus on AWS
- Extensive experience with Python programming and software development
- Strong knowledge of database systems, particularly DynamoDB
- Hands-on experience with Amazon Connect
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities

The ideal candidate will also have:
- Experience with containerization technologies (e.g., Docker, Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Familiarity with serverless architectures and microservices
- Experience with data analytics and big data technologies
- Understanding of machine learning and AI concepts
- Contributions to open-source projects or technical communities
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are a plus
- Experience mentoring junior engineers or leading small teams
- Strong project management skills and ability to manage multiple priorities

If you are passionate about cloud technologies, have a proven track record of delivering innovative solutions, and thrive in a collaborative environment, we want to hear from you. Join our team and help shape the future of cloud computing!

Benefits
As per company standards.
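One recurring pattern behind "troubleshoot and resolve complex technical issues in cloud environments" is absorbing transient throttling from AWS APIs (DynamoDB writes, Connect API calls) with retries and exponential backoff. Below is a small illustrative helper; the function name and the use of `RuntimeError` as the retryable exception are assumptions for the sketch, not an AWS SDK API.

```python
import time

def with_backoff(call, retries=3, base_delay=0.01, retryable=(RuntimeError,)):
    """Invoke `call`, retrying on `retryable` exceptions with exponential
    backoff; re-raise once the retry budget is exhausted."""
    for attempt in range(retries + 1):
        try:
            return call()
        except retryable:
            if attempt == retries:
                raise
            # Delays double each attempt: base, 2*base, 4*base, ...
            time.sleep(base_delay * (2 ** attempt))
```

With boto3 this would wrap the SDK call and catch the service's throttling exception instead of `RuntimeError`; production code would usually also add jitter to avoid synchronized retries.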
Hyderabad
INR 2.0 - 5.0 Lacs P.A.
Work from Office
Full Time
Key Responsibilities:
- Installing, configuring, and troubleshooting all Windows and macOS systems.
- Securing the network by installing and troubleshooting antivirus software and regularly updating antivirus definitions.
- Configuring and troubleshooting local and network printers; resolving hardware-related issues in printers and other peripherals.
- Providing admin rights, remote desktop access, and file and folder access to users as requested.
- Troubleshooting all end-to-end technical problems through remote tools.
- Monitoring the compliance status of all desktops/servers in terms of patch/DAT status.
- Troubleshooting issues on Office 365 and escalating to the proper team.
- Identifying and solving issues on Microsoft products (Excel, PowerPoint, Word, Teams).
- Troubleshooting issues with Citrix connections and client VPN.
- Adding devices to Azure AD; creating, deploying, and managing Intune MDM.
- Creating, deploying, and managing app protection policies and device configuration policies from the Intune endpoint manager.
- Creating and managing security firewalls, switches, and ILL.
- Managing and updating McAfee web controls and firewall rules.
- Maintaining and monitoring CCTV cameras and eSSL access control.
- Identifying the causes of networking problems using diagnostic testing software and equipment.
- Resolving IT tickets regarding computer software, hardware, and application issues on time.
- Setting up equipment for employee use, performing or ensuring proper installation of cables, operating systems, or appropriate software.
- Installing and performing minor repairs to hardware, software, or peripheral equipment.

Requirements
- Good experience in system administration
- Technical support experience
- Experience in the ITIL process
- Experience in RIM (Remote Infrastructure Management)
- Good knowledge of virtualization and cloud concepts with VMware and/or OpenStack
- Excellent communication skills
Company Reviews
Anudeep Fitzgerald
9 months ago
I've been with data economy for 2yrs (still going ), and it's been an incredible journey. The work environment fosters growth and innovation, with sup...
raviteja bavanari
9 months ago
Working with DATAECONOMY is a thrilling and enjoyable experience so far in this long journey of mine. Learning here is very high and competitive. Ever...
suryavamsi vempati
10 months ago
Working at Data Economy has been a truly rewarding experience. I’ve joined this company as a fresher, through out working here company encourages it’s...
vivek reddy
a year ago
At Dataeconomy, we prioritize creating an familial atmosphere where every member feels valued and supported. Our work environment is designed to be fa...
Rohith kumar Gaddi 007
a year ago
Working with Data Economy is my Pleasure. As a fresher I have joined the Company and there is so much I have upskilled my career. It has the flexibili...
ankit gaurav singh
a year ago
DATAECONOMY fosters a friendly and welcoming environment, making it a pleasure to work here. Engaging projects keep you motivated and excited about yo...
Amol Bhosale
a year ago
I am working from last one and half year in Data Economy. I seems that organization has support to learn new cutting edge technology that will help u...
nikhil tyagi
a year ago
I believe data economy is the best place for freshers. It has a great work culture and flexibility for work life balance. We get opportunity to put ou...
suchit hemgire
a year ago
"I'm proud to be a part of "Data Economy" It's an amazing workplace with a fantastic team. The positive work environment and commitment to excellence...
pavan kumar
a year ago
I've thrived at DATAECONOMY thanks to its unwavering commitment to employee development and fostering a genuinely friendly team atmosphere. Equally im...
Chukka Dileepbabu
a year ago
In my time at DATAECONOMY, I've discovered a workplace that goes beyond expectations. The diverse culture, commitment to learning, and inspiring value...
Dipesh Thakur
a year ago
Great company for freshers to learn and explore new technologies. Intelligent and smart seniors gives support to the freshers for their work and learn...
Ajinkya Kulkarni
a year ago
Proud to be part of Data Economy. DataEconomy is a game-changer in the data sector. Work culture is good. Personal & professional balance is good
Shubham Jori
11 months ago
Dataeconomy is a excellent organisation to work into with ample opportunities in various technologies and also the work environment is awesome.Highly ...
Pradhyumn Gawade
9 months ago
Amazing Work Culture with appropriate work and human resource management. Opportunities to grow with proper training under company's wing
Krushna Bodake
10 months ago
Working with Data Economy is my Pleasure. I have upskilled my career. It has the flexibility for Work Life Balance. There are people with huge knowle...
Pooja Shelke
9 months ago
Great experience with Data Economy ! Team is highly supportive and great projects to work on...
KRUSHNA BODAKE
a year ago
The work life balance is good as well as current employees and colleagues supportive and involving.
Dinesh choudhary
a year ago
It's very good experience in Company.Good work culture and good professional and personal life balance.Good Career opportunity.
swapnil powar
a year ago
Very good place to work.Better working culture. Work flexibility with better cutting edge technologies.
Saurabh Dhoke
a year ago
It is a really good organization, it's growing fast now, management is really good and everyone is supportive here
ayush jaiswal
11 months ago
Great place to work. Members are very supportive. Good learning environment
ayush katiyar
9 months ago
Great place to work, exposure to variety of projects and technologies.
ajay umbarkar
10 months ago
Excellent work culture.More learning opportunities.Supportive management.
Rajesh Soora
a year ago
Great place to work. Good Work culture.
Aniket Adwankar
9 months ago
Great place to work
Narsimha Rao
a year ago
Good place to work.
Prakhar Agrawal
a year ago
Great place to work.
Lok Sandeep
a year ago
Personally liked the culture and employee friendly policies. Thanks to Human Resources Team (Sailaja Saranu, Enosh Digumarthi and Shruthi Mamidi)