Diligent Tec, Inc

4 Job openings at Diligent Tec, Inc
US IT Bench Sales Recruiter | Hyderabad, Telangana, India | 1+ years | Salary not disclosed | On-site | Contractual

Job Description
Title: Bench Sales Recruiter
Experience: 1 - 3 years
Timings: EST; onsite in Hyderabad, TS, India
*Excellent verbal and written communication skills in English are mandatory. Freshers with good verbal and written communication skills in English are also welcome.

Duties and responsibilities:
- Identify and source new clients
- Develop and maintain relationships with clients
- Market the agency's bench of consultants to clients
- Screen and qualify candidates
- Conduct interviews and reference checks
- Negotiate and close placements
- Track and report on sales activity

Qualifications:
- Bachelor's degree in any stream
- 1+ years of experience in sales recruiting
- Strong communication and interpersonal skills
- Ability to work independently and as part of a team
- Proficiency in Microsoft Office Suite (MS Excel, Word)

US IT Bench Sales Recruiter | Hyderabad, Telangana, India | 1 - 3 years | Salary (INR) not disclosed | On-site | Full Time

Job Description
Title: Bench Sales Recruiter
Experience: 1 - 3 years
Timings: EST; onsite in Hyderabad, TS, India
*Excellent verbal and written communication skills in English are mandatory. Freshers with good verbal and written communication skills in English are also welcome.

Duties and responsibilities:
- Identify and source new clients
- Develop and maintain relationships with clients
- Market the agency's bench of consultants to clients
- Screen and qualify candidates
- Conduct interviews and reference checks
- Negotiate and close placements
- Track and report on sales activity

Qualifications:
- Bachelor's degree in any stream
- 1+ years of experience in sales recruiting
- Strong communication and interpersonal skills
- Ability to work independently and as part of a team
- Proficiency in Microsoft Office Suite (MS Excel, Word)

Dell Boomi Developer | Chandigarh, Chandigarh, India | 8+ years | Salary not disclosed | Remote | Contractual

Title: Dell Boomi Developer
Vacancy: 2 positions
Work Mode: Remote (Pan-India)
Duration: 12+ months contract
Experience: minimum 8 years, maximum 15 years

Boomi Senior Developer to design, develop, and maintain integration solutions using Boomi AtomSphere, with deep expertise in integration architecture, API management, and cloud-based solutions.

Responsibilities:
- Design and develop integration solutions using Boomi AtomSphere
- Design and manage REST/SOAP APIs on the Boomi platform
- Handle data mapping and transformation for various formats (XML, JSON, flat file)
- Identify bottlenecks and optimize integration processes for performance and reliability
- Implement hybrid integrations between on-premise and cloud applications such as Salesforce, Workday, Coupa, JIRA, S4/HANA, and MS SharePoint
- Monitor, debug, and resolve integration issues efficiently
- Create and maintain technical documentation and enforce best practices in integration design
- Work with cross-functional teams such as BSAs, other developers, and QA

Skillset (must have):
- Minimum 10 years of middleware / iPaaS experience, with a minimum of 5 years on the Boomi platform
- Strong knowledge of Boomi Atoms, Molecules, Atom Queues, and API Management
- Experience in setting up integrations from scratch
- Experience integrating cloud and enterprise applications
- Strong understanding of authentication mechanisms
- Experience setting up CI/CD pipelines and DevOps
- Excellent problem-solving and debugging skills
- Boomi Developer/Architect certifications
- Knowledge of Groovy, Java
- Understanding of microservice architecture
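The data-mapping duty above (XML, JSON, flat file) is normally configured inside Boomi's map shapes, but the idea can be sketched in plain Python with the standard library. The `Order` document shape and its field names are hypothetical, invented for illustration; nothing here is Boomi's actual API.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def xml_order_to_json(xml_text: str) -> str:
    """Map a hypothetical XML order document to a JSON payload."""
    root = ET.fromstring(xml_text)
    record = {
        "orderId": root.findtext("Id"),
        "customer": root.findtext("Customer/Name"),
        "total": float(root.findtext("Total", default="0")),
    }
    return json.dumps(record)

def orders_to_flat_file(records: list) -> str:
    """Write the same records out as a flat (CSV) file."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["orderId", "customer", "total"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

xml_doc = "<Order><Id>42</Id><Customer><Name>Acme</Name></Customer><Total>99.5</Total></Order>"
payload = json.loads(xml_order_to_json(xml_doc))
print(payload)                                  # {'orderId': '42', 'customer': 'Acme', 'total': 99.5}
print(orders_to_flat_file([payload]).strip())   # header row, then 42,Acme,99.5
```

In a Boomi process the same XML-to-JSON-to-flat-file chain would be a connector, a map shape, and a profile per format; the point of the sketch is only the mapping step itself.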

Senior GCP Data Engineer | Chennai, Tamil Nadu, India | 5+ years | Salary not disclosed | Remote | Contractual

Senior Data Engineer - GCP Native Platform
Location: Fully Remote (Pan-India)

Role Summary
Seeking an experienced Senior Data Engineer to design and build scalable data pipelines on Google Cloud Platform. You will implement data solutions using BigQuery, Dataflow, Cloud Composer, and modern engineering practices while mentoring junior team members and driving technical excellence.

Key Responsibilities

Pipeline Development & Engineering
- Build and optimize batch and streaming data pipelines using Dataflow (Apache Beam).
- Develop and maintain Airflow DAGs in Cloud Composer for workflow orchestration.
- Implement ELT processes using Dataform (SQLX) for transformations, testing, and documentation.
- Design and maintain BigQuery datasets following medallion architecture (Bronze/Silver/Gold).
- Create reusable pipeline templates and frameworks to accelerate development.

Data Platform Implementation
- BigQuery: optimized schemas, partitioning strategies, clustering, materialized views, and performance tuning.
- Dataflow: build Apache Beam pipelines in Python and/or Java for complex transformations.
- Cloud Composer: manage Airflow workflows with dependency management and scheduling.
- Dataform: author SQLX transformations, tests, and documentation.
- Pub/Sub: implement event-driven ingestion and CDC patterns.
- Cloud Storage (GCS): architect data lake structures, access patterns, and lifecycle policies.

Technical Delivery
- Partner with architects on technical design and platform standards.
- Conduct code reviews and enforce engineering best practices.
- Troubleshoot and optimize pipeline performance and cost.
- Implement data quality checks, monitoring, and observability.
- Support production deployments and incident resolution.

Collaboration & Mentorship
- Mentor junior engineers on GCP best practices and data engineering patterns.
- Collaborate with analysts, data scientists, and cross-functional teams to enable data consumption.
- Document technical solutions and maintain knowledge-base artifacts.
- Participate in agile ceremonies and coordinate with onshore/offshore teams.

Required Qualifications

Experience
- 5+ years of data engineering experience; minimum 2 years working with GCP.
- Hands-on experience with BigQuery at enterprise scale and production pipelines.
- Strong background in building production-grade batch and streaming pipelines.
- Development experience in Python and/or Java.
- Proven track record of delivering complex data solutions.

Core Technical Skills
- BigQuery (SQL optimization, scripting, stored procedures, partitioning).
- Apache Beam / Dataflow (batch & streaming).
- Airflow / Cloud Composer (DAG development).
- SQL and Python programming proficiency.
- Git, CI/CD pipelines, and basic DevOps familiarity.

Additional Skills
- Dataform or dbt experience.
- Infrastructure-as-Code (Terraform basics).
- Data modeling and schema design.
- API integration (REST) and system interfacing.
- Cloud Monitoring, logging, and data quality frameworks.

Preferred / Nice-to-Have
- GCP Professional Data Engineer certification.
- Experience with Vertex AI for ML pipeline integration.
- Cloud Functions for serverless processing.
- Dataproc / Spark workloads.
- Real-time streaming architectures and low-latency processing.
- Familiarity with data governance tools (Atlan, Collibra, Dataplex).
- Experience with legacy system migrations.
- BI tools experience (Power BI, Looker).
- Containerization (Docker) and orchestration (Kubernetes).

What We're Looking For
- Strong problem-solving and analytical skills.
- Self-motivated and able to work independently in a remote environment.
- Excellent communication, documentation, and stakeholder management skills.
- Passion for data engineering, continuous learning, and mentoring others.
- Collaborative team player with a delivery-focused mindset.
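The medallion (Bronze/Silver/Gold) layering this role mentions can be illustrated with a minimal, stdlib-only Python sketch. In a real deployment each layer would be a BigQuery dataset fed by a Dataflow pipeline; the event shape and field names below are invented for illustration.

```python
from collections import defaultdict

# Bronze layer: raw events exactly as ingested (duplicates and bad rows included).
bronze = [
    {"user": "a", "amount": "10.0"},
    {"user": "a", "amount": "10.0"},   # duplicate event
    {"user": "b", "amount": "oops"},   # malformed amount
    {"user": "b", "amount": "5.5"},
]

def to_silver(rows):
    """Silver layer: validated, typed, and deduplicated records."""
    seen, out = set(), []
    for row in rows:
        try:
            key = (row["user"], float(row["amount"]))
        except ValueError:
            continue  # drop rows that fail type validation
        if key not in seen:
            seen.add(key)
            out.append({"user": key[0], "amount": key[1]})
    return out

def to_gold(rows):
    """Gold layer: a business-level aggregate (total spend per user)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["user"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'a': 10.0, 'b': 5.5}
```

Each hop only refines the previous layer: Bronze keeps an auditable raw copy, Silver enforces schema and uniqueness, and Gold serves the aggregated view consumers actually query.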