Title: Technical Delivery Lead / Manager (Night Shift)
Work Location: Hi-tech City (Cyber Gateways), Hyderabad
Work Time: US CST time zone
Experience: 10+ years

Responsibilities:
- Understand the customer's technical requirements and develop a staffing plan and job descriptions for technical resources
- Work with the recruiting team to screen and interview candidates for IT applications and infrastructure positions
- Conduct online technical evaluations of candidates
- Submit technical evaluation reports to the account managers
- Develop Statements of Work and Change Orders
- Manage relationships with US customers and US employees
- Develop monthly and quarterly status reports
- Develop proposals for new engagements
- Work in the US CST time zone

Requirements:
- Must have 5+ years of experience managing technical teams
- Must have experience managing US customer relations
- Must have worked as a Software Developer or Infrastructure Engineer earlier in their career
- Must have a good understanding of both applications and infrastructure technologies
- Strong IT background with hands-on experience across various technologies
- Experience developing SOWs and technical proposals is a plus
- Good verbal and written communication skills

Interested candidates, please share your updated resume with Sam.kumar@comtecinfo.com or call +91 83176 98407 (between 11 AM and 11 PM IST only).

Thank you,
Sam | Sr Manager
Job description
Role: Data Analyst (Maximo Supply Chain)
Location: Hyderabad - Cyber Gateways (Hybrid, 3 days a week)
Work Timing: US CST, 8 AM to 5 PM CST (India 6:30 PM to 3:30 AM)

Interested candidates, please share your resume with sam.kumar@comtecinfo.com.

We are seeking a highly skilled Data Analyst with knowledge of IBM Maximo Supply Chain tables. The ideal candidate will have hands-on experience extracting, transforming, and analyzing data from Maximo using SQL and Python, along with a strong understanding of Maximo Supply Chain data and tables. This role will focus on turning raw data into actionable insights to support decision-making across procurement, inventory management, vendor performance, and supply chain operations.

Responsibilities:
- Extract and manipulate data from IBM Maximo databases using SQL and Python (a sketch of this pattern follows this posting)
- Develop and maintain queries, scripts, and automated workflows to support reporting and analysis needs
- Analyze Maximo Supply Chain data (Materials, Inventory, Services, Invoices, Spend) to provide insights and recommendations
- Build dashboards, KPIs, and reports to monitor supply chain performance and trends
- Work with supply chain, procurement, and IT teams to understand business requirements and translate them into data-driven solutions
- Document data models, queries, and processes for knowledge sharing and sustainability

Required Qualifications:
- 5+ years of experience as a Data Analyst or in a similar role
- Proven expertise with SQL and Python
- Hands-on experience extracting and analyzing Maximo Supply Chain data, including but not limited to Materials, Inventory, Services, Invoices, and Spend
- Strong knowledge of relational database concepts and data modeling
- Experience automating ETL workflows or data pipelines
- Ability to communicate technical concepts to non-technical stakeholders
- Strong problem-solving skills and attention to detail
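To make the day-to-day work concrete, here is a minimal sketch of the extract-and-analyze pattern described above, assuming a SQL Server-backed Maximo instance reachable over ODBC. The table and column names (INVOICE, INVOICELINE, vendor, linecost) follow common Maximo schema conventions but should be treated as assumptions, and the connection string is a hypothetical placeholder; verify both against the actual environment.

# A minimal sketch, not a definitive implementation: pull vendor spend
# from Maximo invoice tables and flag outliers with pandas.
import pandas as pd
import pyodbc

# Hypothetical connection string; replace with real Maximo DB details.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=maximo-db;DATABASE=maxdb;Trusted_Connection=yes;"
)

# Assumed Maximo schema: INVOICE carries the vendor, INVOICELINE the line cost.
query = """
SELECT i.vendor,
       SUM(il.linecost) AS total_spend,
       COUNT(*)         AS line_count
FROM invoiceline il
JOIN invoice i
  ON i.invoicenum = il.invoicenum AND i.siteid = il.siteid
WHERE i.statusdate >= DATEADD(month, -12, GETDATE())
GROUP BY i.vendor
ORDER BY total_spend DESC
"""
spend = pd.read_sql(query, conn)

# Flag vendors whose 12-month spend is more than two standard
# deviations above the mean, as a simple starting point for review.
threshold = spend["total_spend"].mean() + 2 * spend["total_spend"].std()
outliers = spend[spend["total_spend"] > threshold]
print(outliers)

In practice the query would be parameterized and the result fed into a dashboard or KPI report rather than printed.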
Role: Data Engineer
Location: Hyderabad - Cyber Gateways (Hybrid, 3 days a week)
Work Timing: US CST, 8 AM to 5 PM CST (India 6:30 PM to 3:30 AM)

Interested candidates, please share your resume with sam.kumar@comtecinfo.com.

We are seeking a skilled AWS Data Engineer with experience in Python, PySpark, Lambda, Airflow, and Snowflake.

Responsibilities:
- Design, build, and optimize ETLs using Python, PySpark, Lambda, Airflow, and other AWS services (see the pipeline sketch after this posting)
- Create SQL queries to segment, manipulate, and format data
- Build automations to ingest, transfer, move, upload, and manipulate data
- Build or maintain data ingestion pipelines that move data from source systems into Snowflake
- Create and manage data models to ensure data integrity and facilitate efficient data analysis
- Implement and maintain data security and compliance measures, including access controls, encryption, and data masking
- Ensure data quality, accuracy, and consistency through data validation, cleansing, and monitoring
- Design and maintain ETL/ELT pipelines to ingest data into Amazon Redshift for analytics and reporting

Requirements:
- Minimum 5 years of experience as a Data Engineer
- 3+ years of Python, PySpark, and Lambda
- Must have experience with Airflow and Snowflake
- Advanced SQL query development proficiency
- Understanding of data modeling principles and techniques
- Knowledge of data security best practices and compliance requirements
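For illustration, here is a minimal Airflow DAG sketch of the ingestion pattern this role covers, assuming Airflow 2.x with PySpark and the Snowflake Python connector available to the workers. The bucket names, Snowflake account, credentials, stage, and table (raw-orders-bucket, @orders_stage, ORDERS_RAW) are hypothetical placeholders, not references to any real system.

# A daily two-step pipeline sketch: PySpark cleans raw files on S3,
# then Snowflake ingests the curated output via COPY INTO.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_with_pyspark(**context):
    """Deduplicate the day's raw JSON with PySpark and write Parquet back to S3."""
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_transform").getOrCreate()
    ds = context["ds"]  # Airflow's logical date, e.g. "2024-01-15"
    df = spark.read.json(f"s3a://raw-orders-bucket/{ds}/")  # hypothetical bucket
    cleaned = (
        df.dropDuplicates(["order_id"])
          .withColumn("ingested_at", F.current_timestamp())
    )
    cleaned.write.mode("overwrite").parquet(f"s3a://curated-orders-bucket/{ds}/")
    spark.stop()


def load_into_snowflake(**context):
    """COPY the curated Parquet files into Snowflake through an external stage."""
    import snowflake.connector

    ds = context["ds"]
    conn = snowflake.connector.connect(
        account="xy12345",   # hypothetical account and credentials; in practice
        user="etl_user",     # pull these from an Airflow connection or a
        password="***",      # secrets manager, never hardcode them
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    conn.cursor().execute(
        f"COPY INTO ORDERS_RAW FROM @orders_stage/{ds}/ "
        "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
    )
    conn.close()


with DAG(
    dag_id="orders_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(task_id="transform", python_callable=transform_with_pyspark)
    load = PythonOperator(task_id="load", python_callable=load_into_snowflake)
    transform >> load

Splitting the transform and load into separate tasks lets Airflow retry the Snowflake COPY on its own if it fails, without rerunning the Spark job.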