Role & responsibilities:

Experience:
- 6+ years of experience in data analysis, with at least 2+ years of experience in Data Vault modeling.
- Prior experience in the financial services domain is highly preferred.

Technical Skills:
- Strong proficiency in SQL and hands-on experience with the Data Vault 2.0 methodology.
- Familiarity with data analysis tools such as Python, R, or SAS.
- Experience with ETL/ELT tools and cloud data platforms (e.g., Azure Synapse, AWS Redshift, or GCP BigQuery).
- Knowledge of WhereScape 3D and RED for Data Vault modelling.

Data Visualization:
- Proficiency in creating dashboards and reports using tools such as Power BI, Tableau, or Qlik.

Soft Skills:
- Excellent analytical thinking and problem-solving abilities.
- Strong communication skills to collaborate effectively with technical and non-technical stakeholders.

Knowledge of Financial Services:
- Understanding of key financial metrics and regulatory requirements, such as Basel III or SOX compliance.
Key Responsibilities:
- Lead the design and development of scalable data pipelines using PySpark and ETL frameworks on Google Cloud Platform (GCP).
- Own end-to-end data architecture and solutions, ensuring high availability, performance, and reliability.
- Collaborate with data scientists, analysts, and stakeholders to understand data needs and deliver actionable insights.
- Optimize complex SQL queries and support advanced data transformations.
- Ensure best practices in data governance, data quality, and security.
- Mentor junior engineers and contribute to team capability development.

Requirements:
- 8+ years of experience in data engineering roles.
- Strong expertise in GCP data services (BigQuery, Dataflow, Pub/Sub, Composer, etc.).
- Hands-on experience with PySpark and building ETL pipelines at scale.
- Proficiency in SQL with the ability to write and optimize complex queries.
- Solid understanding of data modeling, warehousing, and performance tuning.
- Experience with CI/CD pipelines, version control, and infrastructure-as-code is a plus.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- GCP Certification (e.g., Professional Data Engineer).
- Experience with Airflow, Kubernetes, or Terraform.
Virtual Recruitment Drive: Senior Business Analyst (2-4 Years Experience)
Immediate joiners preferred. Kindly go through the JD carefully.

Preferred Candidates:
- Project Experience: 2-4 years of experience in strategy consulting or advanced analytics projects, with a strong business problem-solving mindset.
- Core Tools: Proficient in Advanced Excel, dashboarding (Power BI/Tableau), and PowerPoint for delivering client-ready insights and visualizations.
- Analytical Skills: Ability to work with complex data, identify trends, and translate findings into clear, actionable business recommendations.
- Technical Skills (Good to Have): Exposure to SQL and Python for data manipulation and advanced analytics.
- Domain Exposure (Preferred): Experience in the Retail, FMCG, or CPG sectors is an added advantage.
- Communication & Collaboration: Strong communication skills with the ability to work cross-functionally and present to senior stakeholders.
- Designed, developed, and delivered multiple machine learning projects, including implementation in production.
- Worked on a variety of data science projects.
- Experience converting a business problem into a data science problem.
- Strong grasp of ML and time series fundamentals and their applications in business.
- Proficiency in Python (including pandas/scikit-learn).
- Experience leading or guiding a team of data scientists/ML engineers.
- Familiarity with relational databases and intermediate-level knowledge of SQL.
- A team player who can deliver under pressure.
- An eye for detail, a quick learner, and able to think outside the box.
- Familiarity with cloud platforms.
- Experience working with large data sets and tools like MapReduce, Hadoop, Hive, etc., and large-scale data streaming technologies like Spark is a plus.
Role & responsibilities
- Application Development: Design, develop, and maintain scalable applications using .NET technologies, ensuring high performance on desktop and web platforms.
- ETL Process Management: Develop and maintain ETL processes using SSIS to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Database Management: Design and develop complex SQL queries, stored procedures, functions, and views for data retrieval and manipulation.
- Collaboration: Work closely with cross-functional teams to gather requirements and to define, design, and ship new features, ensuring alignment with business objectives.
- Performance Optimization: Monitor application and ETL pipeline performance, identify bottlenecks, and implement solutions to address them.
- Security and Compliance: Ensure application and data security by applying appropriate infrastructure controls and security patches and by adhering to compliance requirements.
- Code Quality: Write clean, maintainable, and efficient code that complies with company standards and best practices. Participate in code reviews to ensure code quality and share knowledge.
- Documentation: Maintain comprehensive documentation for code, processes, and procedures to facilitate knowledge sharing and system maintenance.
- Continuous Improvement: Stay abreast of the latest and emerging technologies in software development, including AI/ML, and incorporate new technologies into solution design as appropriate.

Preferred candidate profile
- Proficiency in .NET Framework, C#, ASP.NET, MVC, Web API, and JavaScript frameworks (e.g., React/Angular).
- Extensive experience with SQL Server, including writing complex queries, stored procedures, and functions.
- Hands-on experience with SSIS for developing ETL processes.
- Familiarity with SSRS for report generation.
- Experience with Azure DevOps and CI/CD pipelines.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to mentor junior developers and foster a positive team environment.
We are seeking a highly creative and experienced Senior Graphic Designer to lead design initiatives across digital platforms. The ideal candidate has a strong UI/UX portfolio, proficiency in AI design tools, and expertise in illustration, Photoshop, PowerPoint, and Canva. You'll collaborate cross-functionally to deliver high-impact visual solutions that align with brand and user needs.

Key Responsibilities:
- Design intuitive and engaging UI/UX for web and mobile applications.
- Develop high-quality illustrations, graphics, and presentations (PPTs).
- Use AI tools to streamline design workflows and generate creative assets.
- Ensure brand consistency across all visual touchpoints.
- Collaborate with product, marketing, and development teams.
- Mentor junior designers and lead creative direction when needed.

Requirements:
- 5+ years of experience in graphic and UI/UX design.
- Proficient in Figma/Adobe XD, Photoshop, Illustrator, Canva, and PowerPoint.
- Experience with AI design tools (e.g., Midjourney, DALL·E, Runway).
- Strong visual, typographic, and storytelling skills.
- Ability to handle multiple projects in a fast-paced environment.
Role & responsibilities
• Assume ownership of Data Engineering projects from inception to completion.
• Implement fully operational Unified Data Platform solutions in production environments using technologies such as Databricks, Snowflake, Azure Synapse, etc.
• Showcase proficiency in data modelling and data architecture.
• Utilize modern data transformation tools such as dbt (Data Build Tool) to streamline and automate data pipelines (nice to have).
• Implement DevOps practices for continuous integration and deployment (CI/CD) to ensure robust and scalable data solutions (nice to have).
• Maintain code versioning and collaborate effectively within a version-controlled environment.
• Familiarity with data ingestion and orchestration tools such as Azure Data Factory, Azure Synapse, AWS Glue, etc.
• Set up processes for data management and templatized analytical modules/deliverables.
• Continuously improve processes with a focus on automation, and partner with different teams to develop system capability.
• Proactively seek opportunities to help and mentor team members by sharing knowledge and expanding skills.
• Communicate effectively with internal and external stakeholders.
• Coordinate with cross-functional team members to ensure high quality in deliverables with no impact on timelines.

Preferred candidate profile
• Expertise in programming languages such as Python and advanced SQL.
• Working knowledge of Data Warehousing, Data Marts, and Business Intelligence, with hands-on experience implementing fully operational data warehouse solutions in production environments.
• 3+ years of working knowledge of big data tools (Hive, Spark), along with ETL tools and cloud platforms.
• 3+ years of relevant experience in either Snowflake or Databricks. Certification in Snowflake or Databricks is highly recommended.
• Proficient in data modelling and ELT techniques.
• Experienced with ETL/data pipeline orchestration tools such as Azure Data Factory, AWS Glue, Azure Synapse, Airflow, etc.
• Experience ingesting data from different data sources such as RDBMS, ERP systems, APIs, etc.
• Knowledge of modern data transformation tools, particularly dbt (Data Build Tool), for streamlined and automated data pipelines (nice to have).
• Experience implementing DevOps practices for CI/CD to ensure robust and scalable data solutions (nice to have).
• Proficient in maintaining code versioning and effective collaboration within a version-controlled environment.
• Ability to work effectively as an individual contributor and in small teams, with experience mentoring junior team members.
• Excellent problem-solving and troubleshooting ability, with experience supporting and working with cross-functional teams in a dynamic environment.
• Strong verbal and written communication skills, with the ability to communicate effectively and articulate results and issues to internal and client teams.
Preferred candidate profile

Mandatory:
- Should be an immediate joiner or have a maximum notice period of 15 days.
- Should have a minimum of 4 years of experience in strategy and advanced analytics.
- Should be well versed in Advanced Excel and PowerPoint.

Good to have:
- Experience in the CPG, Retail, and FMCG domains.
We are looking for a technically strong Lead Developer with expertise in Power BI development and .NET Core. The ideal candidate should have deep experience in data modeling, DAX, SQL, and building scalable backend applications using .NET Core.

Key Responsibilities:
- Design and develop interactive dashboards and reports in Power BI.
- Write optimized DAX and SQL queries for complex data transformations.
- Build and maintain backend APIs and services using .NET Core.
- Collaborate with data engineers, analysts, and business users to understand reporting requirements.
- Ensure performance optimization, security, and scalability of both reporting and backend layers.
- Lead a team of developers and guide them in technical decisions and best practices.
- Troubleshoot data, report, or service-level issues and provide timely resolutions.

Requirements:
- 7+ years of experience in software development.
- Strong hands-on experience with Power BI, DAX, and SQL.
- Solid expertise in .NET Core, C#, and REST API development.
- Experience integrating Power BI with backend systems.
- Good understanding of data modeling and ETL pipelines.
- Experience working with Agile teams and leading technical deliveries.
- Excellent communication and stakeholder management skills.

Good to Have:
- Knowledge of Azure services, Power BI Embedded, or DevOps practices.
- Experience with performance tuning and handling large datasets.
About the Company:
Decision Point is a fast-growing Analytics & Big Data Solutions company whose business partners feature in the global Fortune 500 list. Specializing in applying math to solve complex business problems in the Consumer Packaged Goods (CPG) space, Decision Point offers a 360-degree learning platform with a learning curve considered among the steepest in the industry. You will develop in-depth know-how of the CPG/FMCG business and learn the application of analytics and big data technologies. The entrepreneurial environment at Decision Point offers exposure to advanced analytics in the sales and marketing domain. The team comprises young, energetic, and highly skilled individuals, including talented business consultants, data scientists, and engineers who are passionate about executing solutions for real-world business problems. A career at Decision Point is rich in experience and offers opportunities to build lasting relationships.

Role and Responsibilities:
As a Project Manager at Decision Point, your responsibilities will include gathering and analyzing project requirements for business and technical completeness. You will own product sprints, facilitate the development of user stories and use cases to support functional designs, and have complete ownership of end-to-end aspects of tool development, including understanding tool business objectives, datasets, wireframes, delivery, and maintenance plans. You are expected to have a strong track record of working in or leading analytical product development teams, with a basic understanding of ETL/model training pipelines and cloud services and platforms. Collaborating closely with consulting, development, and technical teams is crucial to ensure all business and technical requirements are incorporated into designs and builds.
Your role also involves shaping the strategy for product development, driving day-to-day project team activities to meet milestones, and using strong program management, organizational, and leadership skills to manage resources and critical path activities effectively. You will be responsible for collecting, analyzing, integrating, and maintaining key cross-functional deliverables; defining and managing critical path activities; facilitating project risk management; identifying deviations from approved project plans and managing their resolution; and ensuring appropriate verification for delivered solutions.

Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, or a related technical discipline
- Minimum 4 years of professional experience managing complex technology projects
- Hands-on coding experience
- Minimum 2-3 years of work experience in Supply Chain Planning and Network Optimization
- Experience delivering large, cross-functional projects
- Exceptional written and verbal communication skills
- Experience in the software development life cycle, from conception to delivery

Benefits:
- Broaden Knowledge Base: Opportunity to broaden your knowledge base on existing solutions deploying advanced analytics techniques incorporating AI/ML, while staying updated on recent industry trends.
- Direct Client Interaction: Direct client interaction to accelerate the learning process and build a holistic understanding of solution features and requirements.
- Impact on Business: Witness the impact of implementing recommendations/strategies by working closely with clients on a day-to-day basis.
- Fast-Track Career Growth: A fast-track career path for ambitious individuals, offering high rewards, challenging roles, and annual promotions with handsome raises.
- Young and Dynamic Culture: A youthful and energetic work environment with flexibility and work-life balance, occasional team retreats, and quarterly get-togethers.
- Exposure to Leadership Roles: Opportunity to lead projects across technical and consulting domains, enabling end-to-end project execution.