Tech T7 Innovations

5 Job openings at Tech T7 Innovations
Avaloq Developer | Pune, Maharashtra, India | 0 years | Not disclosed | On-site | Full Time

Job Title: Avaloq Developer
Location: Pune, India
Client: HSBC
Open Positions: 5

Job Summary:
We are looking for experienced Avaloq Developers to join our team supporting a prestigious banking client, HSBC. The ideal candidates will have a strong background in Avaloq development, with a deep understanding of the private banking, capital markets, and investment domains. You will be involved in designing, developing, and delivering high-quality solutions on the Avaloq platform, working in an Agile/DevOps environment.

Key Responsibilities:
- Analyze user requirements and deliver robust solutions using Avaloq parameterization, Unix, and shell scripting.
- Design and develop scalable and maintainable Avaloq-based applications aligned with business goals.
- Ensure high-quality code delivery through best practices in development, testing, and documentation.
- Participate in all phases of the software development lifecycle, including analysis, design, coding, testing, deployment, and support.
- Write and execute unit and integration test cases.
- Collaborate with business analysts, architects, QA, and other stakeholders to validate technical and business requirements.
- Support batch processes, EOD activities, and interface/integration development.
- Provide support for production issues and actively contribute to problem-solving and debugging efforts.
- Prepare and maintain comprehensive technical documentation.
- Work effectively within an Agile/DevOps framework.
- Support POD members and address specific issues/queries arising from project execution.

Required Skills & Experience:
- Strong hands-on experience with Avaloq development and the Avaloq platform.
- Solid understanding of the private banking, capital markets, and investment domains.
- Knowledge of Avaloq modules such as STEX, Corporate Actions, Payments, Settlements, Credit, and Finance.
- Experience with Avaloq reporting, interfaces, and batch/EOD processes.
- Proficiency in PL/SQL, Unix, and shell scripting.
- Ability to perform parameterization within Avaloq.
- Strong analytical, debugging, and problem-solving skills.
- Sound knowledge of Agile methodologies and DevOps practices.
- Excellent written and verbal communication skills.
- Experience in preparing functional/technical documentation and coordinating with stakeholders.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Avaloq certification (nice to have).
- Experience working with global teams or banking clients.

Join us to be part of a fast-paced, collaborative environment where your expertise in Avaloq will contribute to critical banking solutions.

Lead Java Developer | Pune, Maharashtra, India | 8-13 years | Not disclosed | On-site | Full Time

Job Title: Lead Java Developer
Experience Level: 8-13 years
Work Mode: On-site
Location: Pune
Job Type: Full-time

Company Overview:
Tech T7 Innovations is seeking a highly skilled and experienced Lead Java Developer to join our growing team in Pune. If you are passionate about building scalable systems, leading high-performing teams, and working on cutting-edge microservices-based architectures, we want to hear from you.

Key Responsibilities:
- Lead the design, development, and deployment of Java-based enterprise applications.
- Architect and implement microservices using Spring Boot and RESTful APIs.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure system reliability, scalability, and high availability.
- Mentor and guide junior developers, driving best practices in coding, testing, and DevOps.
- Participate in code reviews and ensure code quality using JUnit and Mockito.
- Integrate with messaging platforms using JMS/MQ.
- Work closely with DevOps teams to maintain CI/CD pipelines and deployment processes.
- Contribute to project planning, estimation, and technical documentation.

Must-Have Skills:
- 8-13 years of hands-on experience in software development.
- Expertise in Java, OOP principles, Spring Boot, and RESTful web services.
- Solid understanding of microservices architecture.
- Strong database skills (SQL/NoSQL).
- Experience with messaging systems like JMS/MQ.
- Proficiency in unit testing with JUnit and Mockito.
- Working knowledge of CI/CD tools and DevOps practices.
- Excellent communication and interpersonal skills.
- Understanding of high availability and observability in microservices-based systems.

Good-to-Have Skills:
- Experience with NoSQL databases like MongoDB.
- Exposure to cloud platforms and Ansible.
- Knowledge of the finance domain is a plus.

Human Resources Information System Consultant | Gurugram, Haryana, India | 4-5 years | Not disclosed | On-site | Full Time

Job Title: Analyst - Workday Core HCM
Location: Hybrid (Gurugram)
Job Type: Full-Time
Experience Level: 4 to 5 years
Joining: Immediate
Education/Qualification: BE/BTech/ME/MTech

Company Description:
Tech T7 Innovations is a company that provides IT solutions to clients worldwide. The team consists of highly skilled and experienced professionals who are passionate about IT. Tech T7 Innovations offers a wide range of IT services, including software development, web design, cloud computing, cybersecurity, data engineering, data science, and machine learning. The company is committed to staying up to date with the latest technologies and best practices to deliver the best solutions to its clients.

Job Description - Role and Responsibilities:
We are seeking a Workday SME to configure, manage, and support core HCM Workday processes (Core HR, Org Management, Compensation) and to ensure that these processes are designed per business requirements and operate effectively within Workday.
- Work with the HR business to design and document end-to-end processes to be configured in Workday.
- Create and monitor support cases using ticketing tools, create relevant searches, and run reports.
- Facilitate backlog review meetings with the leads of various workstreams to prioritize requests for upcoming sprint planning meetings.

Senior Data Engineer | Gurugram, Haryana, India | 6 years | Not disclosed | On-site | Full Time

Job Title: Senior Google Cloud Platform (GCP) Data Engineer
Location: Hybrid (Bengaluru, India)
Job Type: Full-Time
Experience Required: Minimum 6 years
Joining: Immediate or within 1 week

About the Company:
Tech T7 Innovations is a global IT solutions provider known for delivering cutting-edge technology services to enterprises across various domains. With a team of seasoned professionals, we specialize in software development, cloud computing, data engineering, machine learning, and cybersecurity. Our focus is on leveraging the latest technologies and best practices to create scalable, reliable, and secure solutions for our clients.

Job Summary:
We are seeking a highly skilled Senior GCP Data Engineer with over 6 years of experience in data engineering and extensive hands-on expertise in Google Cloud Platform (GCP). The ideal candidate must have a strong foundation in GCS, BigQuery, Apache Airflow/Composer, and Python, with a demonstrated ability to design and implement robust, scalable data pipelines in a cloud environment.

Roles and Responsibilities:
- Design, develop, and deploy scalable and secure data pipelines using Google Cloud Platform components, including GCS, BigQuery, and Airflow.
- Develop and manage robust ETL/ELT workflows using Python and integrate with orchestration tools such as Apache Airflow or Cloud Composer.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver reliable and efficient data solutions.
- Optimize BigQuery performance using best practices such as partitioning, clustering, schema design, and query tuning.
- Manage, monitor, and maintain data lake and data warehouse environments with high availability and integrity.
- Automate pipeline monitoring, error handling, and alerting mechanisms to ensure seamless and reliable data delivery.
- Contribute to architecture decisions involving data modeling, data flow, and integration strategies in a cloud-native environment.
- Ensure compliance with data governance, privacy, and security policies as per enterprise and regulatory standards.
- Mentor junior engineers and drive best practices in cloud engineering and data operations.

Mandatory Skills:
- Google Cloud Platform (GCP): In-depth hands-on experience with GCS, BigQuery, IAM, and Cloud Functions.
- BigQuery (BQ): Expertise in large-scale analytics, schema optimization, and data modeling.
- Google Cloud Storage (GCS): Strong understanding of data lifecycle management, access controls, and best practices.
- Apache Airflow / Cloud Composer: Proficiency in writing and managing complex DAGs for data orchestration.
- Python Programming: Advanced skills in automation, API integration, and data processing using libraries like Pandas, PySpark, etc.

Preferred Qualifications:
- Experience with CI/CD pipelines for data infrastructure and workflows.
- Exposure to other GCP services like Dataflow, Pub/Sub, and Cloud Functions.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
- Strong communication and analytical skills for problem-solving and stakeholder engagement.
- GCP certifications (e.g., Professional Data Engineer) will be a significant advantage.

Senior GCP Data Engineer | Greater Hyderabad Area | 6 years | Not disclosed | On-site | Full Time

Job Title: Senior Google Cloud Platform (GCP) Data Engineer
Location: Hybrid (Pune, Hyderabad)
Job Type: Full-Time
Experience Required: Minimum 6+ years
Joining: Immediate or within 1 week

About the Company:
Tech T7 Innovations is a global IT solutions provider known for delivering cutting-edge technology services to enterprises across various domains. With a team of seasoned professionals, we specialize in software development, cloud computing, data engineering, machine learning, and cybersecurity. Our focus is on leveraging the latest technologies and best practices to create scalable, reliable, and secure solutions for our clients.

Job Summary:
We are seeking a highly skilled Senior GCP Data Engineer with over 6 years of experience in data engineering and extensive hands-on expertise in Google Cloud Platform (GCP). The ideal candidate must have a strong foundation in GCS, BigQuery, Apache Airflow/Composer, and Python, with a demonstrated ability to design and implement robust, scalable data pipelines in a cloud environment.

Roles and Responsibilities:
- Design, develop, and deploy scalable and secure data pipelines using Google Cloud Platform components, including GCS, BigQuery, and Airflow.
- Develop and manage robust ETL/ELT workflows using Python and integrate with orchestration tools such as Apache Airflow or Cloud Composer.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver reliable and efficient data solutions.
- Optimize BigQuery performance using best practices such as partitioning, clustering, schema design, and query tuning.
- Manage, monitor, and maintain data lake and data warehouse environments with high availability and integrity.
- Automate pipeline monitoring, error handling, and alerting mechanisms to ensure seamless and reliable data delivery.
- Contribute to architecture decisions involving data modeling, data flow, and integration strategies in a cloud-native environment.
- Ensure compliance with data governance, privacy, and security policies as per enterprise and regulatory standards.
- Mentor junior engineers and drive best practices in cloud engineering and data operations.

Mandatory Skills:
- Google Cloud Platform (GCP): In-depth hands-on experience with GCS, BigQuery, IAM, and Cloud Functions.
- BigQuery (BQ): Expertise in large-scale analytics, schema optimization, and data modeling.
- Google Cloud Storage (GCS): Strong understanding of data lifecycle management, access controls, and best practices.
- Apache Airflow / Cloud Composer: Proficiency in writing and managing complex DAGs for data orchestration.
- Python Programming: Advanced skills in automation, API integration, and data processing using libraries like Pandas, PySpark, etc.

Preferred Qualifications:
- Experience with CI/CD pipelines for data infrastructure and workflows.
- Exposure to other GCP services like Dataflow, Pub/Sub, and Cloud Functions.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
- Strong communication and analytical skills for problem-solving and stakeholder engagement.
- GCP certifications (e.g., Professional Data Engineer) will be a significant advantage.