Epikso India

2 Job openings at Epikso India
Business Development Manager | Gurugram | 5-7 years | INR 6.0-12.0 Lacs P.A. | Remote | Full Time

Hi, I hope this message finds you well. Thank you for your interest in the Business Development Manager role with us at Epik Solutions. We're excited to move forward and would like to confirm a few details to ensure alignment on the position requirements and expectations. Please review the job description summary below and confirm your acknowledgment by replying to this email. If you have any questions, feel free to let us know.

Position: Business Development Manager
Shift Timings: 7:00 PM to 4:30 AM
Location: Remote

About Company:
Epik Solutions is a global technology company that embraces creativity and diversity, using technology to inspire and implement solutions that meet our customers' needs. Capabilities include:
- Digital Transformation: We help clients reimagine and capture possibilities while strengthening the performance of their existing digital assets. Our team brings ideas to life and builds organizations' capabilities to deliver their best outcomes on an ongoing basis.
- Workforce Transformation: We partner with companies seeking to hone their business strategies and amplify performance. Epik Solutions helps optimize workforce accessibility and capacity to produce real business results.

Job Description:
Epik Solutions is seeking a motivated, results-driven Business Development Manager with a strong focus on lead generation, outreach, and sales pipeline management. This role involves identifying high-quality leads, running targeted outreach campaigns, and using tools such as HubSpot, ZoomInfo, and CoPilot to drive top-of-funnel activity and support overall sales growth. The ideal candidate is a skilled prospector, tech-savvy, and thrives in a fast-paced B2B sales environment.

Key Responsibilities:
1. Lead Generation & Outreach
- Proactively identify and qualify potential leads through tools like ZoomInfo, LinkedIn, and other data enrichment platforms.
- Launch and manage personalized outreach campaigns using CoPilot and HubSpot.
- Develop email sequences, call scripts, and messaging strategies tailored to target personas.
- Monitor and optimize outreach performance metrics (open rates, response rates, etc.).
2. Sales Pipeline Management
- Track and manage leads, prospects, and sales activities in HubSpot CRM.
- Maintain accurate, up-to-date records of prospect interactions and activities.
- Nurture leads through the pipeline with timely follow-ups and relevant communication.
3. Sales Reporting & Optimization
- Analyze outreach and sales performance data to identify trends and areas for improvement.
- Report regularly on lead generation effectiveness, conversion rates, and pipeline health.
- A/B test outreach strategies to continuously improve engagement and conversions.
4. Cross-Functional Collaboration
- Coordinate with marketing teams to align outreach efforts with campaigns and messaging.
- Provide feedback on lead quality and market response to refine targeting and campaigns.
- Collaborate with sales executives to ensure a seamless hand-off of qualified leads.
5. Tool Utilization & Automation
- Leverage HubSpot for campaign management, lead tracking, and CRM updates (a brief API sketch follows the qualifications below).
- Use ZoomInfo for advanced lead research and segmentation.
- Maximize the use of CoPilot and other automation tools for efficient, scalable outreach.

Qualifications:
- Bachelor's degree in Business, Marketing, or a related field.
- 5+ years of experience in B2B lead generation and outbound sales, preferably in the technology or SaaS industry.
- Hands-on experience with HubSpot, ZoomInfo, CoPilot, and other sales enablement tools.
- Proven ability to generate qualified leads and build robust sales pipelines.
- Excellent written and verbal communication skills.
- Data-driven mindset with strong analytical and reporting skills.
- Ability to work independently, manage priorities, and meet targets consistently.
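For illustration only: the outreach workflow above centers on tracking leads in HubSpot CRM. Below is a minimal, hypothetical sketch of creating a contact through HubSpot's public CRM v3 REST API in Python; the environment variable name, function, and contact details are placeholder assumptions, not details from the posting.

```python
# Hypothetical sketch: register a qualified lead as a HubSpot CRM contact
# via the public CRM v3 REST API. Token and field values are placeholders.
import os

import requests

HUBSPOT_TOKEN = os.environ["HUBSPOT_PRIVATE_APP_TOKEN"]  # assumed env var


def create_contact(email: str, first_name: str, lifecycle_stage: str = "lead") -> dict:
    """Create a contact record so outreach activity can be tracked in the CRM."""
    resp = requests.post(
        "https://api.hubapi.com/crm/v3/objects/contacts",
        headers={
            "Authorization": f"Bearer {HUBSPOT_TOKEN}",
            "Content-Type": "application/json",
        },
        json={
            "properties": {
                "email": email,
                "firstname": first_name,
                "lifecyclestage": lifecycle_stage,
            }
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # response body includes the new contact's "id"


if __name__ == "__main__":
    contact = create_contact("prospect@example.com", "Asha")
    print("Created contact id:", contact["id"])
```

In practice the same call is often made through HubSpot's official client library; the raw REST form is shown here only to keep the sketch self-contained.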

GCP Data Engineer | Gurugram | 5-10 years | INR 17.0-32.0 Lacs P.A. | Remote | Full Time

Hi, I hope this message finds you well. Thank you for your interest in the Sr. GCP Data Engineer role with us at Epik Solutions. We're excited to move forward and would like to confirm a few details to ensure alignment on the position requirements and expectations. Please review the job description summary below and confirm your acknowledgment by replying to this email. If you have any questions, feel free to let us know.

Position: Sr. GCP Data Engineer
Location: Work From Home

About Company:
Epik Solutions is a global technology company that embraces creativity and diversity, using technology to inspire and implement solutions that meet our customers' needs. Capabilities include:
- Digital Transformation: We help clients reimagine and capture possibilities while strengthening the performance of their existing digital assets. Our team brings ideas to life and builds organizations' capabilities to deliver their best outcomes on an ongoing basis.
- Workforce Transformation: We partner with companies seeking to hone their business strategies and amplify performance. Epik Solutions helps optimize workforce accessibility and capacity to produce real business results.

Job Description:
As a GCP Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Google Cloud Platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Databricks, Python, SQL, PySpark/Scala, and Informatica will be essential for the following key responsibilities:

Key Responsibilities:
- Designing and developing data pipelines: Design and implement scalable, efficient data pipelines using GCP-native services (e.g., Cloud Composer, Dataflow, BigQuery) and tools like Databricks, PySpark, and Scala, covering data ingestion, transformation, and loading (ETL/ELT) processes (a minimal pipeline sketch follows the qualifications below).
- Data modeling and database design: Develop data models and schema designs to support efficient data storage and analytics using tools like BigQuery, Cloud Storage, or other GCP-compatible storage solutions.
- Data integration and orchestration: Orchestrate and schedule complex data workflows using Cloud Composer (Apache Airflow) or similar orchestration tools. Manage end-to-end data integration across cloud and on-premises systems.
- Data quality and governance: Implement data quality checks, validation rules, and governance processes to ensure data accuracy, integrity, and compliance with organizational standards and external regulations.
- Performance optimization: Optimize pipelines and queries to enhance performance and reduce processing time, including tuning Spark jobs and SQL queries and leveraging caching mechanisms or parallel processing in GCP.
- Monitoring and troubleshooting: Monitor data pipeline performance using the GCP operations suite (formerly Stackdriver) or other monitoring tools. Identify bottlenecks and troubleshoot ingestion, transformation, or loading issues.
- Documentation and collaboration: Maintain clear, comprehensive documentation of data flows, ETL logic, and pipeline configurations. Collaborate closely with data scientists, business analysts, and product owners to understand requirements and deliver data engineering solutions.

Skills and Qualifications:
- 5+ years of experience in a Data Engineer role with exposure to large-scale data processing.
- Strong hands-on experience with Google Cloud Platform (GCP), particularly services like BigQuery, Cloud Storage, Dataflow, and Cloud Composer.
- Proficient in Python and/or Scala, with a strong grasp of PySpark.
- Experience working with Databricks in a cloud environment.
- Solid experience building and maintaining big data pipelines, architectures, and data sets.
- Strong knowledge of Informatica for ETL/ELT processes.
- Proven track record of manipulating, processing, and extracting value from large-scale, unstructured datasets.
- Working knowledge of stream processing and scalable data stores (e.g., Kafka, Pub/Sub, BigQuery).
- Solid understanding of data modeling concepts and best practices in both OLTP and OLAP systems.
- Familiarity with data quality frameworks, governance policies, and compliance standards.
- Skilled in performance tuning, job optimization, and cost-efficient cloud architecture design.
- Excellent communication and collaboration skills for cross-functional and client-facing work.
- Bachelor's degree in Computer Science, Information Systems, or a related field (Mathematics, Engineering, etc.).
- Bonus: experience with distributed computing frameworks like Hadoop and Spark.
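For illustration only: the responsibilities above describe orchestrating GCS-to-BigQuery ETL/ELT with Cloud Composer (Apache Airflow). The sketch below shows the general shape of such a pipeline, assuming the apache-airflow-providers-google package; the DAG id, bucket, dataset, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal Cloud Composer (Airflow) DAG sketch of the GCS -> BigQuery
# ingest-then-transform pattern described above. All resource names are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Ingest: load the day's raw CSV exports from Cloud Storage
    # into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",            # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],    # templated by run date
        destination_project_dataset_table="analytics.staging_sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: run an ELT step as BigQuery SQL to build the curated table.
    transform = BigQueryInsertJobOperator(
        task_id="build_curated_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.curated_sales AS
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM analytics.staging_sales
                    GROUP BY order_id, customer_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # ingest must finish before the transform runs
```

One design note: keeping the transform as a BigQuery SQL job (ELT) keeps the heavy lifting inside the warehouse; the Databricks/PySpark route the posting also mentions fits better once transformations outgrow SQL.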