We are looking to hire a Splunk Developer. This is a full-time remote contract position; the initial contract will be offered for 3 months. You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 5+ years

Role Description
This is a full-time remote role for a Splunk Developer. The Splunk Developer will be responsible for developing, configuring, and maintaining Splunk systems. Day-to-day tasks include creating Splunk queries, developing dashboards and alerts, and optimizing the performance of SPL (Search Processing Language) searches. This role requires working closely with other IT teams to ensure seamless integration of Splunk solutions with other enterprise systems.

Qualifications
· Proficiency in Splunk, including development, configuration, and maintenance
· Experience creating Splunk queries, dashboards, and alerts
· Knowledge of optimizing performance for SPL searches
· Strong understanding of data indexing, parsing, and ingestion
· Excellent analytical and problem-solving skills
· Strong communication and collaboration skills, capable of working remotely
· Experience integrating Splunk solutions in enterprise environments is a plus
· Bachelor's degree in Computer Science, Information Technology, or a related field
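As an illustration of the SPL performance tuning this role involves, one common technique is pushing index, sourcetype, and field filters into the base search so Splunk prunes events early instead of filtering after retrieval. A minimal, hypothetical Python sketch (the index and field names are placeholders, not from this posting):

```python
def build_search(index, sourcetype, field_filters, stats_clause):
    """Build an SPL query that puts all filters in the initial search
    command, so events are pruned at the index level rather than in a
    later `| search` pipe."""
    base = f"search index={index} sourcetype={sourcetype}"
    # Early filtering: field=value pairs belong in the base search.
    for field, value in field_filters.items():
        base += f" {field}={value}"
    return f"{base} | {stats_clause}"

# Hypothetical example: count web errors by host.
spl = build_search(
    index="web",
    sourcetype="access_combined",
    field_filters={"status": 500},
    stats_clause="stats count by host",
)
```

This is only a sketch of the filter-early principle; real searches would be authored and profiled in Splunk itself.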
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 5+ years

About the role: Our client is about to start an ERP replacement. They plan to move away from the AWS platform to an Azure data lake feeding Snowflake. We need a resource who can act as a Snowflake thought leader and who has Microsoft Azure data engineering expertise.

Key Responsibilities:
· Data Ingestion & Orchestration (Transformation & Cleansing):
o Design and maintain Azure Data Factory (ADF) pipelines: Extract data from sources like ERPs (SAP, Oracle), UKG, SharePoint, and REST APIs.
o Configure scheduled/event-driven loads: Set up ADF for automated data ingestion.
o Transform and cleanse data: Develop logic in ADF for Bronze-to-Silver layer transformations.
o Implement data quality checks: Ensure accuracy and consistency.
· Snowflake Data Warehousing:
o Design, develop, and optimize data models: Create tables, views, and stored procedures for both Silver and Gold layers.
o ETL/ELT in Snowflake: Transform curated Silver data into highly optimized analytical Gold structures.
o Performance tuning: Optimize queries and data loads.
· Data Lake Management:
o Implement Azure Data Lake Gen2 solutions: Follow medallion architecture (Bronze, Silver).
o Manage partitioning, security, and governance: Ensure efficient and secure data storage.
· Collaboration & Documentation: Partner with stakeholders to convert data needs into technical solutions, document pipelines and models, and uphold best practices through code reviews.
· Monitoring & Support: Track pipeline performance, resolve issues, and deploy alerting/logging for proactive data integrity and issue detection.
· Data visualization tools: Proficient in tools like Power BI, DAX, and Power Query for creating insightful reports.
· Scripting: Skilled in Python for data processing and analysis to support data engineering tasks.

Required Skills & Qualifications:
· 5+ years of experience in data engineering, data warehousing, or ETL development.
· Microsoft Azure proficiency:
o Azure Data Factory (ADF): Experience designing, developing, and deploying complex data pipelines.
o Azure Data Lake Storage Gen2: Hands-on experience with data ingestion, storage, and organization.
· Expertise in Snowflake Data Warehouse and ETL/ELT: Understanding of Snowflake architecture; SQL proficiency for manipulation and querying; experience with Snowpipe, tasks, streams, and stored procedures.
· Strong understanding of data warehousing concepts and ETL/ELT principles.
· Data Formats & Integration: Experience with various data formats (e.g., Parquet, CSV, JSON) and data integration patterns.
· Data Visualization: Experience with Power BI, DAX, Power Query.
· Scripting: Python for data processing and analysis.
· Soft Skills: Problem-solving, attention to detail, communication, and collaboration.

Nice-to-Have Skills: Version control (e.g., Git), Agile/Scrum methodologies, and data governance and security best practices.
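The Bronze-to-Silver cleansing and data quality checks described in this posting can be sketched in plain Python. This is a hedged illustration only: the column names and rules are hypothetical, and in practice this logic would live in ADF data flows or Snowflake SQL rather than a script:

```python
def bronze_to_silver(rows):
    """Cleanse raw (Bronze) records into a curated (Silver) set:
    drop duplicates on the business key, normalize casing and
    whitespace, and reject rows that fail basic quality checks."""
    seen = set()
    silver, rejected = [], []
    for row in rows:
        # Quality check: required fields must be present and non-empty.
        if not row.get("id") or not row.get("name"):
            rejected.append(row)
            continue
        # Normalize: trim whitespace, standardize casing.
        clean = {
            "id": str(row["id"]).strip(),
            "name": str(row["name"]).strip().title(),
        }
        # Deduplicate on the business key.
        if clean["id"] in seen:
            continue
        seen.add(clean["id"])
        silver.append(clean)
    return silver, rejected

bronze = [
    {"id": "1", "name": "  acme corp "},
    {"id": "1", "name": "Acme Corp"},   # duplicate key, dropped
    {"id": "2", "name": ""},            # fails quality check, rejected
]
silver, rejected = bronze_to_silver(bronze)
```

Tracking rejected rows separately (rather than silently dropping them) is what makes the "accuracy and consistency" checks auditable.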
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 4+ years

Key Responsibilities:
· Lead architecture and design of business processes in PRPC (Pega) and BPMN.
· Collaborate with BAs and stakeholders to understand process and data needs.
· Configure and customize business rules, workflows, and UI components.
· Guide and mentor developers. Review code and ensure alignment with best practices.
· Integrate Pega with REST/SOAP APIs, third-party systems, and cloud infrastructure.
· Participate in Agile ceremonies and planning.

Skills Required:
· Pega PRPC (must)
· BPMN, Java, SQL, REST, SOAP
· HTML, CSS, JavaScript (for front-end customization)
· Agile methodology
· Cloud & DevOps familiarity (CI/CD, containerization, monitoring)
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 3+ years

Key Responsibilities
· Write and execute test cases for business processes.
· Test REST/SOAP APIs, front-end UI, and backend services.
· Perform regression, functional, and integration testing.
· Work with developers and BAs to resolve defects.
· Contribute to automation testing where applicable.

Skills Required
· Manual and basic automation testing
· REST API testing tools (Postman, SoapUI)
· SQL queries for data validation
· Agile testing, defect management (Jira, Zephyr)
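The API testing and data validation work in this role often reduces to checking a JSON response body against an expected schema. A minimal Python sketch of that idea, with a made-up endpoint payload (in practice the payload would come from Postman, SoapUI, or an HTTP client call):

```python
def validate_response(payload, required_fields):
    """Return a list of validation errors for a JSON response body:
    missing fields and fields with the wrong type."""
    errors = []
    for field, expected_type in required_fields.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

# Hypothetical response from an orders endpoint.
payload = {"order_id": 42, "status": "shipped"}
schema = {"order_id": int, "status": str, "total": float}
errors = validate_response(payload, schema)
```

Returning all errors at once (instead of failing on the first) gives a defect report the whole picture in one run.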
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 5+ years

Job Summary
The PLM Administrator is responsible for the configuration, maintenance, and support of the organization's PLM system, ensuring optimal performance, data integrity, and user adoption. This role works closely with engineering, manufacturing, quality, and IT teams to support product data management and lifecycle processes from design to end-of-life.

Key Responsibilities
· Serve as the primary administrator for Oracle Cloud Fusion PLM, overseeing day-to-day operations, monitoring, and support.
· Configure and maintain PLM modules such as Product Development, Product Hub, Innovation Management, and Quality Management.
· Manage user access, roles, and security in accordance with corporate governance and compliance standards.
· Perform system configuration, functional setups, and environment maintenance, including patches and quarterly release updates.
· Partner with business users to gather requirements and translate them into system configurations or enhancements.
· Support data migration, integration, and master data management across ERP, SCM, and other enterprise applications.
· Troubleshoot issues, perform root cause analysis, and provide timely resolution for PLM-related incidents.
· Collaborate with IT, business stakeholders, and Oracle support for issue resolution and system enhancements.
· Develop and maintain system documentation, standard operating procedures, and training materials.
· Ensure compliance, data integrity, and performance optimization of PLM applications.
· Lead or support system testing, validation, and change management processes.

Qualifications
· Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent experience).
· 5+ years of experience with Oracle PLM, including at least 3 years in Oracle Cloud Fusion PLM administration.
· Strong knowledge of PLM processes: Product Development, Change Management, Quality, Compliance, Item & BOM Management.
· Hands-on expertise with Oracle Cloud roles, security console, functional setups, and user provisioning.
· Experience with data migration, FBDI, OTBI, BI Publisher, and integration tools (OIC/REST/SOAP APIs).
· Understanding of cloud architecture, release management, and SaaS administration best practices.
· Strong analytical and problem-solving skills with the ability to troubleshoot complex PLM issues.
· Excellent communication, documentation, and collaboration skills.
· Preferred: Experience with Agile PLM to Fusion PLM migration, or exposure to ERP/SCM Cloud integration.

Preferred Skills
· Oracle Cloud certifications (PLM/SCM/ERP).
· Knowledge of regulatory compliance requirements (ISO, FDA, ITAR, etc.).
· Familiarity with Agile methodologies and DevOps practices for PLM administration.
· Experience in manufacturing, life sciences, or high-tech industries.
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 6+ years

Job Summary:
We are seeking a highly skilled Data Engineer with strong hands-on experience in Snowflake and dbt (Data Build Tool) to join our data engineering team. The ideal candidate will be responsible for designing and developing scalable data pipelines, performing advanced data transformations, and ensuring data quality using modern data stack technologies.

Key Responsibilities:
· Design, develop, and optimize data pipelines using dbt and Snowflake.
· Build efficient, reliable, and scalable data transformation models with dbt Core or dbt Cloud.
· Implement Snowflake features such as Snowpipe, Streams, Tasks, and Dynamic Tables.
· Work closely with Data Analysts, Analytics Engineers, and business teams to understand data requirements.
· Ensure data quality and perform rigorous data testing and validation using dbt tests.
· Maintain and enhance the data warehouse architecture to support business intelligence and reporting needs.
· Monitor data pipeline performance and troubleshoot issues proactively.
· Apply version control practices (Git) and CI/CD for data workflows.
· Strong proficiency in Python: comfortable writing production-grade Python code, interacting with APIs to extract and integrate data from various sources, and automating workflows.
· Experience handling large-scale data ingestion, transformation, and processing tasks, ensuring data quality, reliability, and scalability across platforms.
Required Skills & Qualifications:
· 6+ years of experience in data engineering.
· Strong hands-on experience with Snowflake, including data modeling, performance tuning, and administration.
· Advanced proficiency in dbt (Core or Cloud) for data transformations and testing.
· Proficient in SQL (complex queries, CTEs, window functions, optimization).
· Experience with ETL/ELT design patterns and tools like Apache NiFi, Airflow, and Fivetran.
· Solid understanding of data warehousing concepts, dimensional modeling, and medallion architecture.
· Experience with AWS is a must; experience with other cloud providers such as Azure or GCP is a plus.
· Familiarity with Git/GitHub and version-controlled deployment pipelines.
· Excellent communication skills and ability to work in cross-functional teams.
· Demonstrated ability to thrive in fast-paced environments, with a strong aptitude for diving deep into datasets, identifying patterns, and uncovering data quality issues in environments where data sanity is low.
This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified; please do not apply if you are not comfortable with this verification process. This is a client-facing role, so excellent communication in English is a must.

Apply only if you have hands-on experience with Customer Decision Hub (CDH), Next Best Action (NBA), and related processes. This skill requirement is a must and non-negotiable.

Min. Experience: 4+ years

Key Responsibilities:
· Lead architecture and design of business processes in PRPC (Pega) and BPMN.
· Collaborate with BAs and stakeholders to understand process and data needs.
· Configure and customize business rules, workflows, and UI components.
· Guide and mentor developers. Review code and ensure alignment with best practices.
· Integrate Pega with REST/SOAP APIs, third-party systems, and cloud infrastructure.
· Participate in Agile ceremonies and planning.

Skills Required:
· Pega PRPC (must)
· Customer Decision Hub, Next Best Action processes
· BPMN, Java, SQL, REST, SOAP
· HTML, CSS, JavaScript (for front-end customization)
· Agile methodology
· Cloud & DevOps familiarity (CI/CD, containerization, monitoring)

Employment Type: Contract
Location: Ahmedabad, Gujarat
Job Type: Full-time / Hybrid / Freelancer
Job Title: Business Development Executive
Experience Range: 3 - 5 years

Key Responsibilities
• Identify and generate leads: Research and identify potential clients through various channels, including networking, cold calling, and attending industry events.
• Client engagement: Build and nurture relationships with new and existing clients to understand their needs and provide tailored IT solutions.
• Sales presentations: Prepare and deliver compelling sales presentations and proposals to prospective clients.
• Negotiation and closing: Negotiate contracts and close deals to achieve or exceed sales targets.
• Market research: Stay updated on industry trends, market conditions, and competitors to identify new business opportunities.
• Collaboration: Work closely with the technical and project management teams to ensure successful project delivery and client satisfaction.
• Reporting: Maintain accurate records of sales activities, client interactions, and pipeline status using CRM software.

Qualifications
• Bachelor's/Master's degree in Business, Marketing, or a related field.
• Proven sales experience in the IT industry for overseas markets.
• Prior experience selling IT solutions integrated with hardware systems is preferred.
• Strong understanding of IT solutions and services.
• Excellent communication, negotiation, and presentation skills.
• Ability to work independently and as part of a team.
• Proficiency in using CRM software and other sales tools.
• Willingness to travel as needed.
• Able and willing to work during overlapping time zones with the USA.
This is a full-time remote contract position, 2 - 3 months to start with (so no freelancing or moonlighting is possible). You may need to provide at least 4 hours of overlap with the US time zone (Phoenix, AZ). This is a client-facing role, so excellent communication in English is a must.

Min. Experience: 3+ years

About the Job
We're looking for a Test Automation Engineer who can independently drive quality initiatives while helping triage complex production issues. You'll build sophisticated automated testing solutions, provide technical guidance to the team, and work closely with support to ensure customer-reported bugs are quickly understood and resolved.

Key Responsibilities:

Test Automation & Quality Engineering (65-70%)
· Create comprehensive test plans and test cases based on product requirements and user stories
· Design and implement automated test suites using Playwright for web applications
· Develop complex test scenarios, including end-to-end workflows and integration tests
· Implement CI/CD pipeline integration for automated testing across multiple environments
· Write clean, maintainable, and well-documented automation code
· Optimize test execution performance and reduce flakiness in automated tests
· Manage test data strategies and generate synthetic data for various testing scenarios
· Conduct performance, load, and security testing
· Develop and maintain test reporting and analytics dashboards to track quality metrics
· Perform database validation and create SQL queries for data verification testing
· Work with AWS services to set up and maintain test environments and infrastructure
· Leverage AWS CloudWatch for monitoring test execution and analyzing logs

Quality Advocacy & Collaboration (15-20%)
· Collaborate with developers to ensure new features are designed with testability in mind
· Advocate for quality through a "shift-left" mindset, pushing back when engineers have not done proper testing
· Participate in design and architecture reviews to identify potential quality risks early
· Provide technical guidance to the team on testing best practices and methodologies
· Identify gaps in test coverage and proactively recommend improvements
· Champion quality standards and testing processes across the engineering organization

Bug Triage & Technical Investigation (15-20%)
· Independently investigate and reproduce complex bugs reported by customer support
· Create detailed, actionable bug reports with reproduction steps, logs, and diagnostic information
· Collaborate with support on potential workarounds

Qualifications
· Strong programming skills in at least one language (Python, JavaScript/TypeScript, Java, or C#)
· Hands-on experience with Playwright for test automation
· Proven experience implementing CI/CD pipeline integrations for automated testing
· Working knowledge of AWS services (EC2, S3, RDS, Lambda, CloudWatch, or similar)
· Strong experience with test data management and synthetic data generation
· Experience with performance and load testing tools (JMeter, k6, Gatling, or similar)
· Knowledge of security testing principles and tools (OWASP, Burp Suite, or similar)
· Experience with database technologies and SQL for data validation
· Experience with test reporting and analytics tools
· Demonstrated ability to write clean, maintainable, and well-documented code
· Proven track record of optimizing test execution and reducing test flakiness
· Strong analytical and debugging skills for complex technical investigations
· Experience with GitHub Actions for automated testing workflows and CI/CD pipelines

Employment Type: Contract
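One piece of the test data strategy work this posting mentions is generating synthetic fixtures. A small, hypothetical Python sketch; seeding the random generator keeps the same fixtures reproducible across CI runs, which also helps reduce flakiness:

```python
import random
import string

def make_users(n, seed=0):
    """Generate n deterministic synthetic user records for test fixtures.
    A seeded RNG makes the data reproducible across CI environments."""
    rng = random.Random(seed)
    users = []
    for i in range(n):
        # Random but reproducible local part for a fake email address.
        name = "".join(rng.choices(string.ascii_lowercase, k=8))
        users.append({
            "id": i + 1,
            "email": f"{name}@example.com",
            "active": rng.random() < 0.8,  # roughly 80% active accounts
        })
    return users

users = make_users(3)
```

The field names and 80/20 active split are illustrative assumptions; real fixtures would mirror the application's actual schema.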