7 - 12 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (programming language), Data Build Tool (dbt)
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes for data migration and deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Data Build Tool (dbt), Python (programming language).
- Strong understanding of data architecture and data modeling.
- Experience in developing and optimizing ETL processes.
- Knowledge of cloud data platforms and services.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
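For illustration only, a minimal sketch of the kind of Snowflake ETL step this role describes: loading staged files and merging them into a curated table using the snowflake-connector-python package. Stage, table, and connection names are hypothetical.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)

try:
    cur = conn.cursor()
    # Load newly arrived files from an external stage into a raw table.
    cur.execute("""
        COPY INTO RAW.ORDERS_RAW
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Transform and upsert into the curated table (the "T" and "L" of ETL).
    cur.execute("""
        MERGE INTO ANALYTICS.CURATED.ORDERS AS tgt
        USING (
            SELECT ORDER_ID, CUSTOMER_ID,
                   CAST(ORDER_TS AS TIMESTAMP_NTZ) AS ORDER_TS,
                   ORDER_AMOUNT
            FROM RAW.ORDERS_RAW
        ) AS src
        ON tgt.ORDER_ID = src.ORDER_ID
        WHEN MATCHED THEN UPDATE SET tgt.ORDER_AMOUNT = src.ORDER_AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, CUSTOMER_ID, ORDER_TS, ORDER_AMOUNT)
            VALUES (src.ORDER_ID, src.CUSTOMER_ID, src.ORDER_TS, src.ORDER_AMOUNT)
    """)
finally:
    conn.close()
```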
Posted 1 month ago
5 - 10 years
15 - 27 Lacs
Bengaluru
Work from Office
SUMMARY: This is a remote position.
Full Stack Developer
Experience: 5-7 years
Location: Remote

Key Skills:
- UI development using React JS, JavaScript, HTML5, and CSS3
- Spring frameworks (Spring Boot, Spring Cloud Services, Spring Security, etc.) and REST API development
- Oracle, Kafka

Preferred Knowledge and Skills:
- Knowledge of the financial domain and trade life cycle (BNY Mellon)
- Continuous Integration/Continuous Deployment (CI/CD) experience

Responsibilities:
- Designing and developing serverless applications using Azure Function Apps
- Creating, managing, and optimizing Azure SQL and Snowflake databases for scalable data storage solutions
- Developing and integrating REST APIs for seamless communication between systems
- Collaborating with cross-functional teams to implement API integration with third-party and internal systems
- Implementing and maintaining logging to ensure application observability, debugging, and monitoring
- Writing clean, efficient, and well-documented Python code following best practices
- Optimizing application performance and scalability for large datasets and high-traffic systems
- Monitoring, troubleshooting, and resolving production issues in real time
- Participating in code reviews, technical discussions, and agile development processes
- Staying updated with emerging technologies and industry trends to drive innovation

Requirements:
- 5-10 years of experience as a Full Stack Developer
- Proficiency in Java, Spring Boot, ReactJS, Kafka, JavaScript, Maven, and Docker
- Knowledge of the financial domain and trade life cycle (BNY Mellon)
- Experience with Continuous Integration/Continuous Deployment (CI/CD)
- Familiarity with Azure Function Apps, Azure SQL, and Snowflake databases
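As a rough illustration of the "serverless REST API on Azure Function Apps" responsibility (the listing mixes a Java/React stack with Python serverless work), here is a minimal sketch using the Azure Functions Python v2 programming model. The route, payload, and names are hypothetical, not taken from the posting.

```python
# Minimal sketch of an HTTP-triggered Azure Function (Python v2 programming model).
import json
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="trades/{trade_id}", methods=["GET"])
def get_trade(req: func.HttpRequest) -> func.HttpResponse:
    trade_id = req.route_params.get("trade_id")
    # A real service would query Azure SQL / Snowflake here; the lookup is stubbed.
    trade = {"tradeId": trade_id, "status": "SETTLED"}
    return func.HttpResponse(
        json.dumps(trade),
        mimetype="application/json",
        status_code=200,
    )
```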
Posted 1 month ago
3 - 6 years
12 - 14 Lacs
Hyderabad
Work from Office
Overview
Lead – Biddable (Reporting)
This exciting role of Lead – Biddable (Reporting) requires you to creatively manage biddable media campaigns for our global brands. Your expertise in DSPs and knowledge of the digital market cycle would make you a great fit for this position. This is a great opportunity to work closely with top global brands and own large, reputed accounts.

About us
We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

Responsibilities
- Work with clients and stakeholders to gather reporting requirements.
- Design solutions and mock-ups of reports based on requirements that define every detail.
- Develop reporting based on marketing data in Excel and Power BI.
- Collaborate with other members of the reporting design team and the data & automation team to build and manage complex data lakes that support reporting.
- Extract reports from platforms such as DV360, LinkedIn, TikTok, Reddit, Snapchat, Meta, and Twitter.
- Organize the reports into easily readable tables.
- Offer comprehensive insights based on the reporting data; assess data against previous benchmarks and provide judgments/recommendations.
- Communicate directly with agencies for projects.
- Manage communication for end clients.
Preferred: Experience in digital marketing such as paid search, paid social, or programmatic display is extremely helpful.

Qualifications
- A full-time graduate degree (mandatory).
- A proven history of 6+ years as a marketing reporting analyst, or experience in a similar role.
- A solid understanding of paid digital marketing functions is essential to this job.
- Strong experience working with data in Excel (VLOOKUPs, SUMIFS, and advanced functions are a must).
- Experience working with web-based reporting platforms such as Looker is preferred.
- Strong communication skills, with a strong preference for candidates who have collaborated with teams in the United States or United Kingdom, including gathering requirements or collaborating on solution design.
Posted 1 month ago
12 - 16 years
35 - 40 Lacs
Bengaluru
Work from Office
As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta Lake, or Hudi.
- Desirable to have experience provisioning AWS data analytical resources with Terraform.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
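For illustration, a minimal sketch of the kind of AWS Glue PySpark ETL job this listing describes. The Glue catalog database, table, and S3 bucket names are hypothetical.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw events cataloged by a Glue crawler.
events = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
).toDF()

# Basic validation and transformation: drop bad rows, derive a partition column.
cleaned = (
    events.filter(F.col("event_id").isNotNull())
          .withColumn("event_date", F.to_date("event_ts"))
)

# Write curated output back to S3 as partitioned Parquet for analytics.
(cleaned.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))

job.commit()
```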
Posted 1 month ago
12 - 16 years
35 - 40 Lacs
Chennai
Work from Office
As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta Lake, or Hudi.
- Desirable to have experience provisioning AWS data analytical resources with Terraform.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
Posted 1 month ago
12 - 16 years
35 - 40 Lacs
Mumbai
Work from Office
As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta Lake, or Hudi.
- Desirable to have experience provisioning AWS data analytical resources with Terraform.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
Posted 1 month ago
12 - 16 years
35 - 40 Lacs
Kolkata
Work from Office
As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta Lake, or Hudi.
- Desirable to have experience provisioning AWS data analytical resources with Terraform.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
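To illustrate the "automate ETL workflows with AWS Step Functions and Lambda" responsibility, a minimal sketch of a Lambda handler that starts a Step Functions execution when new raw data lands in S3. The state machine ARN, bucket, and account details are hypothetical.

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine that orchestrates the Glue/EMR ETL steps.
STATE_MACHINE_ARN = "arn:aws:states:ap-south-1:123456789012:stateMachine:nightly-etl"

def lambda_handler(event, context):
    # Triggered by an S3 "object created" event for a new raw data drop.
    record = event["Records"][0]["s3"]
    payload = {
        "bucket": record["bucket"]["name"],
        "key": record["object"]["key"],
    }
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps(payload),
    )
    return {"executionArn": response["executionArn"]}
```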
Posted 1 month ago
7 - 12 years
10 - 19 Lacs
Chandigarh
Remote
Role & responsibilities
- Should have a minimum of 6+ years of experience in Snowflake DBA/administration.
- Should manage storage, SHIR (self-hosted integration runtime), and Spark pool allocation.
- Should have experience working with managed private endpoints.
- Should have experience in creating, maintaining, and monitoring firewall rules.
- Should have good communication skills and experience working in a global delivery model involving onsite and offshore teams.
- Should have experience in Agile methodology.
Please share your resume @ Ravina.m@vhrsol.com
Posted 1 month ago
6 - 10 years
0 - 2 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities: Senior Data Engineer
Business domain knowledge: SaaS, SFDC, NetSuite
Areas we support: Product, Finance (ARR reporting), GTM, Marketing, Sales
Tech Stack: Fivetran, Snowflake, dbt, Tableau, GitHub

What we are looking for:
1. SQL and data modeling at an intermediate level: writes complex SQL queries, builds data models, and has experience with data transformation.
2. Problem solver: someone who can weed through the ambiguity of the ask.
3. Bias for action: asks questions, reaches out to stakeholders, comes up with solutions.
4. Communication: effectively communicates with stakeholders and team members.
5. Documentation: can create BRDs (business requirements documents).
6. Someone well versed in Finance (ARR reporting) and/or GTM (sales and marketing) would be an added advantage.
7. Experience in SaaS, NetSuite, and Salesforce will be a plus.
8. Independent, self-starter, motivated, and experienced working in an onsite/offshore environment.
Key is excellent communication, ownership, and working with stakeholders to drive requirements.
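As a rough sketch of the Snowflake + dbt transformation work described above, here is a hypothetical dbt Python model (this assumes dbt 1.3+ with Snowpark support on Snowflake; a SQL model would be the more common choice, but Python keeps all examples here in one language). The model and column names ("stg_subscriptions", "mrr", etc.) are made up for illustration.

```python
# models/finance/arr_by_month.py
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")

    # Upstream staging model (e.g. SFDC/NetSuite data landed via Fivetran).
    subs = dbt.ref("stg_subscriptions")

    # Roll monthly recurring revenue up to ARR per month for Finance reporting.
    arr = (
        subs.with_column("report_month", F.date_trunc("month", F.col("charge_date")))
            .group_by("report_month")
            .agg((F.sum("mrr") * 12).alias("arr"))
    )
    return arr
```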
Posted 1 month ago
3 - 6 years
7 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Key Responsibilities:
- Integration design and development: develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
- Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
- Develop API integrations to facilitate seamless data exchange with internal master data management systems.
- Monitor and optimize data integration processes to ensure high performance and reliability.
- Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
- Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
- Maintain detailed documentation of integration processes and workflows.

Experience:
- 3-4 years of proven experience as a SnapLogic Integration Engineer.
- Experience with the Snowflake cloud data platform is preferred.
- Experience in API integration and development.
- Familiarity with RESTful API design and integration.
- Strong understanding of ETL/ELT processes.
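SnapLogic pipelines are built in its low-code designer rather than in code; as a rough illustration of the same REST-source-to-Snowflake-target pattern, here is a hedged Python sketch using requests and the Snowflake connector. The endpoint, table, and credential names are hypothetical, and the target staging table is assumed to already exist.

```python
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Extract: pull records from a (hypothetical) internal MDM REST endpoint.
resp = requests.get(
    "https://mdm.example.com/api/v1/customers",
    headers={"Authorization": "Bearer ***"},
    timeout=30,
)
resp.raise_for_status()
df = pd.DataFrame(resp.json()["records"])

# 2. Light transform: normalise column names to match the target table.
df.columns = [c.upper() for c in df.columns]

# 3. Load: append the batch into a Snowflake staging table.
conn = snowflake.connector.connect(
    account="my_account", user="integration_user", password="***",
    warehouse="INTEG_WH", database="MDM", schema="STAGING",
)
try:
    success, _, nrows, _ = write_pandas(conn, df, table_name="CUSTOMERS_STG")
    print(f"loaded={success} rows={nrows}")
finally:
    conn.close()
```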
Posted 1 month ago
5 - 7 years
0 - 0 Lacs
Thiruvananthapuram
Work from Office
Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design to develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimise efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during the execution of the project
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design. Follow coding standards, templates, and checklists. Review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, design documentation, and requirements test cases/results.
- Configure: Define and govern the configuration management plan. Ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution. Review the test plan created by the testing team. Provide clarifications to the testing team.
- Domain relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client. Learn more about the customer domain, identifying opportunities to provide valuable additions to customers. Complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or manage user stories.
- Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos.
- Manage Team: Set FAST goals and provide feedback. Understand aspirations of team members and provide guidance, opportunities, etc. Ensure the team is engaged in the project.
- Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components.
- Use data models.
- Estimate time and effort required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team.
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team. Provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
- Build confidence with customers by meeting deliverables on time with quality.
- Estimate time, effort, and resources required for developing/debugging features/components.
- Make appropriate utilization of software/hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficient in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile: Scrum or Kanban methods
- Integrated development environment (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:
Essential
- Analysis and problem-solving capabilities.
- Degree in Maths/Physics/Statistics/similar discipline or previous experience of a similar role.
- Ability to interpret data, recognize problems as they arise, and suggest and implement appropriate actions.
- Ability to communicate clearly, both verbally and via detailed specifications and reports.
- Excellent organizational skills and capability of working to tight deadlines.
- Ability to work to a high level of accuracy.
- Experience in designing, developing, and maintaining applications using Python and PySpark in a Hadoop environment with HDFS.
- Experience in AWS services: S3, Lambda, Managed Airflow (MWAA), and EMR Serverless.
Desirable
- Experience using data dashboarding/presentation tools like Power BI.
- Knowledge of data processing tools like Snowflake.
- Experience in using databases like MongoDB and DynamoDB.
- Any experience working with object-oriented languages like Java or .NET would be a plus.
- Experience in AWS services like Kinesis, API Gateway, etc.
- Awareness of CI/CD tools like Jenkins or Harness.
Required Skills: Python, AWS Services, PySpark, Hadoop Platform
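For illustration, a minimal sketch of the Python + MWAA (managed Airflow) + EMR Serverless stack named in the essentials above: an Airflow DAG whose task submits a PySpark job to EMR Serverless via boto3. The application ID, role ARN, bucket, and script path are hypothetical.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

def submit_spark_job(**_):
    emr = boto3.client("emr-serverless")
    emr.start_job_run(
        applicationId="00f0example",  # hypothetical EMR Serverless application id
        executionRoleArn="arn:aws:iam::123456789012:role/emr-serverless-exec",
        jobDriver={
            "sparkSubmit": {
                "entryPoint": "s3://example-code-bucket/jobs/clean_events.py",
                "sparkSubmitParameters": "--conf spark.executor.memory=4g",
            }
        },
    )

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_emr_serverless_job", python_callable=submit_spark_job)
```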
Posted 1 month ago
5 - 7 years
0 - 0 Lacs
Kolkata
Work from Office
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Required Skills & Qualifications:
- A degree (preferably an advanced degree) in Computer Science, Engineering, or a related field.
- Senior developer with 8+ years of hands-on development experience in Azure using ASB and ADF: extensive experience in designing, developing, and maintaining data solutions/pipelines in the Azure ecosystem, including Azure Service Bus and ADF.
- Familiarity with MongoDB and Python is an added advantage.
Required Skills: Azure Data Factory, Azure Service Bus, Azure, MongoDB
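As a minimal sketch of the Azure Service Bus + Python pattern referenced in the required skills above: consume pipeline-trigger messages from a queue and hand each one to a processing step (for example, kicking off an ADF pipeline run). The connection string, queue name, and payload fields are hypothetical.

```python
import json
from azure.servicebus import ServiceBusClient  # pip install azure-servicebus

CONN_STR = "Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
QUEUE_NAME = "ingest-requests"

def process(payload: dict) -> None:
    # Placeholder for the real work, e.g. starting an ADF pipeline run for this dataset.
    print(f"would trigger pipeline for dataset={payload.get('dataset')}")

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            process(json.loads(str(msg)))
            receiver.complete_message(msg)  # remove from the queue only after success
```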
Posted 1 month ago
5 - 7 years
0 - 0 Lacs
Thiruvananthapuram
Work from Office
Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design to develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimise efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during the execution of the project
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design. Follow coding standards, templates, and checklists. Review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, design documentation, and requirements test cases/results.
- Configure: Define and govern the configuration management plan. Ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution. Review the test plan created by the testing team. Provide clarifications to the testing team.
- Domain relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client. Learn more about the customer domain, identifying opportunities to provide valuable additions to customers. Complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or manage user stories.
- Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos.
- Manage Team: Set FAST goals and provide feedback. Understand aspirations of team members and provide guidance, opportunities, etc. Ensure the team is engaged in the project.
- Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components.
- Use data models.
- Estimate time and effort required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team.
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team. Provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
- Build confidence with customers by meeting deliverables on time with quality.
- Estimate time, effort, and resources required for developing/debugging features/components.
- Make appropriate utilization of software/hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficient in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile: Scrum or Kanban methods
- Integrated development environment (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:
Responsibilities:
- Understand business requirements and existing system designs, security applications and guidelines, etc.
- Work with various SMEs to understand business process flows, functional requirement specifications of the existing system, their current challenges and constraints, and future expectations.
- Streamline the process of sourcing and organizing data (from a wide variety of data sources using Python, PySpark, SQL, and Spark) and accelerating data for analysis.
- Support the data curation process by feeding the data catalog and knowledge bases.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing the data products for consumption.
- Work with data and analytics experts to strive for greater functionality in the data systems.
- Clearly articulate data stories using data science, advanced statistical analysis, visualization tools, PowerPoint presentations, and written and oral communication.
- Manage technical, analytical, and business documentation on all data efforts.
- Engage in hands-on development and work with both onsite and offsite leads and engineers.
Competencies:
- 5+ years of experience in building data engineering pipelines on both on-premise and cloud platforms (Snowflake).
- 5+ years of experience in developing Python-based data applications to support data ingestion, transformation, and data visualizations (Plotly, Streamlit, Flask, Dask).
- Strong experience coding in Python, PySpark, and SQL, and building automations.
- Knowledge of cybersecurity, IT infrastructure, and software concepts.
- 3+ years of experience using data warehousing/data lake techniques in cloud environments.
- 3+ years of developing data visualizations using Tableau, Plotly, Streamlit.
- Experience with ELT/ETL tools like DBT, Cribl, etc.
- Experience in capturing incremental data changes, streaming data ingestion, and stream processing.
- Experience in processes supporting data governance, data structures, and metadata management.
- Solid grasp of data and analytics concepts and methodologies, including data science, data engineering, and data storytelling.
Required Skills: Python, SQL, Cloud Platform
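For illustration, a minimal sketch of the Python + Snowflake + Streamlit data-application work described in the competencies above. The table, schema, and credential names are hypothetical.

```python
# app.py -- run with: streamlit run app.py
import pandas as pd
import snowflake.connector
import streamlit as st

st.title("Security Events Overview")

@st.cache_data(ttl=600)  # cache query results for 10 minutes
def load_daily_counts() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="my_account", user="dash_user", password="***",
        warehouse="BI_WH", database="SECURITY", schema="CURATED",
    )
    try:
        cur = conn.cursor()
        cur.execute("""
            SELECT DATE_TRUNC('day', EVENT_TS) AS DAY, COUNT(*) AS EVENTS
            FROM SECURITY.CURATED.EVENTS
            GROUP BY 1 ORDER BY 1
        """)
        return cur.fetch_pandas_all()  # requires pandas/pyarrow support in the connector
    finally:
        conn.close()

df = load_daily_counts()
st.line_chart(df.set_index("DAY")["EVENTS"])
st.dataframe(df.tail(14))
```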
Posted 1 month ago
2 - 4 years
4 - 6 Lacs
Pune
Work from Office
We seek a Data Engineer/Architect (Expert Level) who shares our passion for innovation and change. This role is critical to helping our business partners evolve and adapt to consumers' personalized expectations in this new technological era.

What will help you succeed:
- Fluent English (B2 - Upper Intermediate)
- Deep Data Architecture & Time Series Database (Timescale) expertise
- Proficiency in Big Data Processing (Apache Spark)
- Experience with Streaming Data Technologies (KSQL, Flink)
- Strong knowledge of Data Governance, Security & Compliance
- Hands-on experience with Snowflake and data sharing design
- Hands-on experience with AWS or Azure cloud, along with Kafka

This job can be filled in Pune. #LI-Hybrid

Create with us digital products that people love. We will bring businesses and consumers together through AI technology and creativity, driving digital transformation to impact the world positively. At Globant, we believe in fostering a diverse and inclusive workplace where everyone feels valued and respected. We are an Equal Opportunity Employer committed to creating a thriving and inclusive environment for all employees and candidates, regardless of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other legally protected characteristic. If you need any assistance or accommodations due to a disability, please let us know by applying through our Career Site or contacting your assigned recruiter. We may use AI and machine learning technologies in our recruitment process. Compensation is determined based on skills, qualifications, experience, and location. In addition to competitive salaries, we offer a comprehensive benefits package. Learn more about our commitment to diversity and inclusion.
Posted 1 month ago
6 - 11 years
22 - 35 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Skill Combination: Snowflake + (Python or DBT) + (AWS or Azure) + SQL + Data warehousing
Location: Kolkata
Exp & CTC:
- Band 4B: 4 to 7 years, up to 21 LPA (fixed)
- Band 4C: 7 to 11 years, up to 28 LPA (fixed)
- Band 4D: 10 to 16 years, up to 35 LPA (fixed)

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python/DBT + Cloud)!
In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry.
- Working experience with building productionized data ingestion and processing data pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully well-versed with data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Work independently on business problems and generate meaningful insights.
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to the customer, working with the offshore team, etc.
- Writing SQL queries against Snowflake and developing scripts to do Extract, Load, and Transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and ability to design and develop efficient ETL jobs using Python or PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2.
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, DBT, AWS/Azure, ETL concepts, & Data Warehousing concepts
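As a rough sketch of the CDC / SCD Type 2 pattern mentioned above: a Snowflake stream is used as the change feed and a MERGE expires the current dimension rows, executed here through the Python connector. Table, stream, and column names are hypothetical, and in production this logic would typically run inside a Snowflake TASK.

```python
import snowflake.connector

SCD2_MERGE = """
MERGE INTO DW.DIM_CUSTOMER AS dim
USING (SELECT * FROM RAW.CUSTOMERS_STREAM WHERE METADATA$ACTION = 'INSERT') AS chg
  ON dim.CUSTOMER_ID = chg.CUSTOMER_ID AND dim.IS_CURRENT = TRUE
WHEN MATCHED AND dim.ADDRESS <> chg.ADDRESS THEN
  UPDATE SET dim.IS_CURRENT = FALSE, dim.VALID_TO = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (CUSTOMER_ID, ADDRESS, VALID_FROM, VALID_TO, IS_CURRENT)
  VALUES (chg.CUSTOMER_ID, chg.ADDRESS, CURRENT_TIMESTAMP(), NULL, TRUE)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="DW", schema="PUBLIC",
)
try:
    conn.cursor().execute(SCD2_MERGE)
    # A follow-up INSERT would normally add the new "current" version for changed keys;
    # it is omitted here to keep the sketch short.
finally:
    conn.close()
```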
Posted 1 month ago
3 - 4 years
5 - 6 Lacs
Noida, Gurugram, Bengaluru
Work from Office
Senior Engineer: The TCA practice has experienced significant growth in demand for engineering & architecture roles from CST, driven by client needs that extend beyond traditional data & analytics architecture skills. There is an increasing emphasis on deep technical skills, such as strong expertise in Azure, Snowflake, Azure OpenAI, and Snowflake Cortex, along with a solid understanding of their respective functionalities. The individual will work on a robust pipeline of TCA-driven projects with pharma clients. This role offers significant opportunities for progression within the practice.

What You'll Do
- Opportunity to work on high-impact projects with leading clients.
- Exposure to complex and technological initiatives.
- Learning support through organization-sponsored trainings & certifications.
- Collaborative and growth-oriented team culture.
- Clear progression path within the practice.
- Opportunity to work on the latest technologies.
- Successful delivery of client projects and a continuous learning mindset, with certifications in newer areas.
- Contribution to partnering with project leads and AEEC leads to deliver complex projects & grow the TCA practice.
- Development of expert tech solutions for client needs, with positive feedback from clients and team members.

What You'll Bring
- 3-4 years of experience in RDF ontologies, RDF-based knowledge graphs (AnzoGraph DB preferred), data modelling, Azure cloud, and data engineering.
- Understanding of ETL processes, data pulls using Azure services via a polling mechanism, and API/middleware development using Azure services.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
Posted 1 month ago
10 - 14 years
15 - 22 Lacs
Gurugram
Work from Office
The ZS Master Data Management Team has an extensive track record of completing over 1000 global projects and partnering with 15 of the top 20 global pharma organizations. They specialize in various MDM domains, offering end-to-end project implementation, change management, and data stewardship support. Their services encompass MDM strategy consulting, implementation for key entities (e.g., HCP, HCO, Employee, Payer, Product, Patient, Affiliations), and operational support including KTLO and Data Stewardship. With 50+ MDM implementations and change management programs annually for life sciences clients, the team has developed valuable assets like MDM libraries and pre-built accelerators. Strategic partnerships with leading platform vendors (Reltio, Informatica, Veeva, Semarchy, etc.) and collaborations with 18+ data vendors and technology providers further enhance their capabilities.

As Business Technology Solutions Manager, you will take ownership of one or more client deliveries at a cross-office level encompassing the area of digital experience transformation. The successful candidate will work closely with ZS Technology leadership and be responsible for building and managing client relationships, generating new business engagements, and providing thought leadership in the digital area.

What You'll Do
- Lead the delivery process right from discovery/POC to managing operations across 3-4 client engagements, helping to deliver world-class MDM solutions.
- Take ownership to ensure the proposed design/architecture and deliverables meet client expectations and solve the business problem with a high degree of quality.
- Partner with the senior leadership team and assist in project management responsibilities, i.e. project planning, staffing management, people growth, etc.
- Develop and implement master data management strategies and processes to maintain high-quality master data across the organization.
- Design and manage data governance frameworks, including data quality standards, policies, and procedures.
- Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team.
- Liaise with staffing partners and HR business partners for team building/planning.
- Lead efforts for building POVs on new technology or problem solving, and innovation to build firm intellectual capital: actively lead unstructured problem solving to design and build complex solutions, tuned to meet expected performance and functional requirements.
- Stay current with industry trends and emerging technologies in master data management and data governance.

What You'll Bring
- Bachelor's/Master's degree with specialization in Computer Science, MIS, IT, or other computer-related disciplines.
- 10-14 years of relevant consulting-industry experience (preferably Healthcare and Life Sciences) working on medium-to-large scale MDM solution delivery engagements.
- 5+ years of hands-on experience designing and implementing MDM services capabilities using tools such as Informatica MDM, Reltio, etc.
- Strong understanding of data management principles, including data modeling, data quality, and metadata management.
- Strong understanding of various cloud-based data management (ETL) platforms such as AWS, Azure, Snowflake, etc.
- Experience in designing and driving delivery of mid-to-large-scale solutions on cloud platforms.
- Experience with ETL design and development, and OLAP tools to support business applications.

Additional Skills
- Ability to manage a virtual global team environment that contributes to the overall timely delivery of multiple projects.
- Knowledge of current data modeling and data warehouse concepts, issues, practices, methodologies, and trends in the Business Intelligence domain.
- Experience with analyzing and troubleshooting the interaction between databases, operating systems, and applications.
- Significant supervisory, coaching, and hands-on project management skills.
- Willingness to travel to other global offices as needed to work with clients or other internal project teams.
Posted 1 month ago
15 - 20 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Sustainable, Client and Regulatory Reporting Data Product Owner - ISS Data (Associate Director)

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing, and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management, and remediation. The Technology group is responsible for providing Technology solutions to the Investment Solutions & Services business (which covers Investment Management, Asset Management Operations & Distribution business units globally). The Technology team supports and enhances existing applications as well as designs, builds, and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching Investment Solutions and Services strategy.

About your role
The Investment Reporting Data Product Owner role is instrumental in the creation and execution of a future state data reporting product to enable Regulatory, Client, Vendor, Internal & MI reporting and analytics. The successful candidate will have an in-depth knowledge of all data domains that represent institutional clients, the investment life cycle, and regulatory and client reporting data requirements. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned with our cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our internal business stakeholders and our clients.

Key Responsibilities
Leadership and Management:
- Lead the ISS distribution, Client Propositions, Sustainable Investing, and Regulatory reporting data outcomes, defining the data roadmap and capabilities and supporting the execution and delivery of the data solutions as a Data Product lead within the ISS Data Programme.
- Line management responsibilities for junior data analysts within the chapter: coaching, influencing, and motivating them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead and define the data product backlog and documentation, enable peer reviews, provide analysis effort estimation, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for driving efficiencies, scale, and innovation.
Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture & engineering.
Coordination and Communication:
- Senior-management-level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- An advocate for the ISS Data Programme.
- Collaborate closely with Data Governance, Business Architecture, and Data owners, etc.
- Conduct workshops within the scrum teams and across business teams; effectively document the minutes and drive the actions.

Your Skills and Experience
- Strong leadership and senior-management-level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- Outstanding knowledge of the client life cycle covering institutional & wholesale, with a focus on CRM data and transfer agency data.
Very good understanding of the data generated by investment management processes and how that is leveraged in Go-to market capabilities such as client reporting, Sales, Marketing. Excellent knowledge of regulatory environment with a focus on European regulations and ESG specific ones such as MIFID II, EMIR, SFDR. Work effortlessly in different operating models such as insourcing, outsourcing and hybrid models. Automation mindset that can drive efficiencies and quality in the reporting landscape. Knowledge of industry standard data calcs for fund factsheets, Institutional admin and investment reports would be an added advantage. In Depth expertise in data and calculations across the investment industry covering the below. Client Specific data: This includes institutional and wholesale client, account and channels data, client preferences and data sets needed for client analytics. Knowledge of Salesforce desirable. Transfer Agency & Platform data: This includes granular client holdings at various levels, client transactions and relevant ref data. Knowledge of role of TPAs as TA and integrating external feeds/products with strategic inhouse data platforms. Investment data: This includes investment life cycle data covering data domains such as trading, ABOR, IBOR, Security and fund reference. Should possess Problem Solving, Attention to detail, Critical thinking. Technical Skills: Hands on SQL, Advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms. Knowledge of data management, data governance, and data engineering practices Hands on experience with data modelling techniques such as dimensional, data vault. Willingness to own and drive things, collaboration across business and tech stakeholders.
Posted 1 month ago
15 - 20 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)

Role
The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future state design for investment and risk data across our key business areas. The successful candidate will have an in-depth knowledge of all data domains that service investment management, risk, and attribution capabilities within the asset management industry. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver the cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities
Leadership and Management:
- Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme.
- Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead data product documentation, enable peer reviews, get analysis effort estimation, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for improving efficiencies and innovation.
Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture & engineering.
Coordination and Communication:
- Senior-management-level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps.
- An advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and Data owners, etc.
- Conduct workshops within the scrum teams and across business teams; effectively document the minutes and drive the actions.

Essential Skills Required
- Strong leadership and senior-management-level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, FactSet support Investment, Risk, Performance, and Attribution business needs.
- Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk, and attribution.
- In-depth expertise in data and calculations across the investment industry covering the below.
- Financial data: This includes information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
Asset-specific data: This includes financial instrument reference data such as asset specifications, maintenance records, usage history and depreciation schedules. Market data: This includes data such as security prices, exchange rates, index constituents and the licensing restrictions on them. Risk data: This includes data related to risk factors such as market risk, credit risk, operational risk and compliance risk. Performance & Attribution data: This includes data on fund performance returns and attribution using methodologies such as Time Weighted Returns and transaction-based performance attribution. Should possess problem solving, attention to detail and critical thinking. Technical Skills: Hands-on SQL, Advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms. Knowledge of data management, data governance and data engineering practices. Hands-on experience with data modelling techniques such as dimensional and data vault. Willingness to own and drive things; collaboration across business and tech stakeholders.
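The Performance & Attribution item above names Time Weighted Returns. Below is a minimal, hedged sketch of how a TWR could be computed across valuation sub-periods; the flow-timing convention (external flows assumed at the start of each sub-period) and all figures are assumptions for illustration only.

```python
# Minimal sketch: Time Weighted Return (TWR) across valuation sub-periods,
# assuming external cash flows land at the start of each sub-period.
def time_weighted_return(valuations, cash_flows):
    """valuations[i] is the portfolio value at the end of sub-period i;
    cash_flows[i] is the external flow at the start of sub-period i."""
    twr = 1.0
    for i in range(1, len(valuations)):
        begin = valuations[i - 1] + cash_flows[i]   # value just after the flow
        period_return = valuations[i] / begin - 1   # sub-period return
        twr *= (1 + period_return)                  # chain-link the sub-periods
    return twr - 1

# Two sub-periods: a 10,000 inflow arrives at the start of the second one
values = [100_000, 104_000, 118_000]
flows = [0, 0, 10_000]
print(f"TWR: {time_weighted_return(values, flows):.2%}")
```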
Posted 1 month ago
3 - 8 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer. Experience Required: 3 to 8 years. Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune. Required Skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 1 month ago
5 - 10 years
15 - 20 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Job description. About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm. Role: Senior Data Analyst. Experience: 5+ Years. Skill Set: Data Analysis, SQL and Cloud (AWS, Azure, GCP). Location: Pune, Hyderabad, Gurgaon. Key Requirements: Bachelor's degree in Computer Science, MIS, or related fields. 6-7 years of relevant analytical experience, translating strategic vision into actionable requirements. Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision. Experience identifying and defining KPIs for business areas such as Sales, Consumer Behavior, Supply Chain, etc. Exceptional SQL skills. Experience with modern visualization tools like Tableau, Power BI, Domo, etc. Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc. Incredible attention to detail with a structured problem-solving approach. Excellent communication skills (written and verbal). Experience with agile development methodologies. Experience in retail or e-commerce domains is a plus. How to Apply: Interested candidates can share their CV at pragati.jha@gspann.com.
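The listing above emphasizes SQL skills and defining KPIs for areas such as Sales. A minimal, hedged sketch follows: a month-over-month revenue growth KPI expressed with a window function. The table and column names are hypothetical, and an in-memory SQLite database stands in for a real warehouse purely so the snippet runs end to end; the same query shape applies on Snowflake, BigQuery, or similar platforms.

```python
# Minimal sketch: a month-over-month revenue growth KPI in SQL, run against
# a tiny in-memory SQLite table so the example is self-contained.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
pd.DataFrame(
    {"month": ["2024-01", "2024-02", "2024-03"], "revenue": [120.0, 132.0, 125.4]}
).to_sql("monthly_sales", conn, index=False)

kpi_sql = """
SELECT month,
       revenue,
       ROUND(100.0 * (revenue - LAG(revenue) OVER (ORDER BY month))
             / LAG(revenue) OVER (ORDER BY month), 2) AS mom_growth_pct
FROM monthly_sales
ORDER BY month
"""
print(pd.read_sql(kpi_sql, conn))
```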
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Kolkata
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years or more of full time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients. Roles & Responsibilities: - Design, build, and configure applications to meet business process and application requirements using Google BigQuery. - Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients. - Develop and maintain technical documentation, including design documents, test plans, and user manuals. - Ensure the quality of deliverables by conducting thorough testing and debugging of applications. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Good To Have Skills: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake. - Strong understanding of SQL and database design principles. - Experience with ETL tools and processes. - Experience with programming languages such as Python or Java. Additional Information: - The candidate should have a minimum of 5 years of experience in Google BigQuery. - The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. - This position is based at our Bengaluru office. Qualifications 15 years or more of full time education
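For the BigQuery skills described above, here is a minimal, hedged sketch of running a query with the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical placeholders, and credentials are assumed to be supplied via the environment (for example GOOGLE_APPLICATION_CREDENTIALS).

```python
# Minimal sketch: run an aggregation query with the google-cloud-bigquery client.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

query = """
    SELECT customer_id, SUM(order_total) AS lifetime_value
    FROM `my-analytics-project.sales.orders`               -- hypothetical table
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

# client.query() submits the job; .result() blocks until it finishes
for row in client.query(query).result():
    print(row["customer_id"], row["lifetime_value"])
```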
Posted 1 month ago
1 - 2 years
16 - 20 Lacs
Gurugram
Work from Office
Job Title - S&C Global Network - AI - Marketing (MMM) - Consultant Management Level: 9-Team Lead/Consultant Location: Bengaluru, BDC7C Must-have skills: Market Mix Modeling (MMM) Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. Must have knowledge of SQL and at least one cloud-based technology (Azure, AWS, GCP). Must have good knowledge of market mix modeling techniques and optimization algorithms and their applicability to industry data. Must have data migration experience from cloud to Snowflake (Azure, GCP, AWS). Managing XML, JSON, and CSV data sets from disparate sources. Manage documentation of data models, architecture, and maintenance processes. Have an understanding of econometric/statistical modeling and analysis techniques such as regression analysis, hypothesis testing, multivariate statistical analysis, time series techniques, optimization techniques, and statistical packages such as R, Python, Java, SQL, Spark, etc. Working knowledge of machine learning algorithms like Random Forest, Gradient Boosting, Neural Networks, etc. Proficient in Excel, MS Word, PowerPoint, etc. Professional & Technical Skills: - Relevant experience in the required domain. - Strong analytical, problem-solving, and communication skills. - Ability to work in a fast-paced, dynamic environment. Additional Information: - Opportunity to work on innovative projects. - Career growth and leadership exposure. WHAT'S IN IT FOR YOU? As part of our Analytics practice, you will join a worldwide network of over 20,000 smart and driven colleagues experienced in leading statistical tools, methods and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Accenture will continually invest in your learning and growth. You'll work with MMM experts, and Accenture will support you in growing your own tech stack and certifications. In Applied Intelligence you will understand the importance of sound analytical decision-making and the relationship of tasks to the overall project, and you will execute projects in the context of a business performance improvement initiative. What you would do in this role: Work through the phases of the project; define data requirements for creating a model and understand the business problem; clean, aggregate, analyze, and interpret data and carry out quality analysis of it. 3+ years of experience in Market Mix Modeling and related concepts such as optimizing promotional channels and budget allocation. Experience in working with nonlinear optimization techniques.
Proficiency in statistical and probabilistic methods such as SVM, decision trees, bagging and boosting techniques, and clustering. Hands-on experience with Python data science and math packages such as NumPy, Pandas, scikit-learn, Seaborn, PyCaret, and Matplotlib. Development of AI/ML models. Develop and manage data pipelines. Develop and manage data within the different layers of Snowflake. Aware of common design patterns for scalable machine learning architectures, as well as tools for deploying and maintaining machine learning models in production. Knowledge of cloud platforms and their use for pipelining, deploying, and scaling marketing mix models. Work along with the team and consultant/manager. Look for insights and create presentations to demonstrate them. Support the development and maintenance of proprietary marketing techniques and other knowledge development projects. Basic level of task management knowledge and experience; should be able to plan own tasks, discuss and work on priorities, and track and report progress. About Our Company | Accenture Qualifications Experience: 8 to 10 Years Educational Qualification: B.Com with 15 years of education
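The market mix modeling work described above typically combines a carry-over (adstock) transformation of media spend with a regression of sales on the transformed channels. A minimal, hedged sketch is below; the decay rates, channel names, and synthetic data are assumptions for illustration, not a production MMM.

```python
# Minimal sketch of a market mix modelling step: geometric adstock + linear
# regression of sales on transformed media spend. Data is synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

def adstock(spend, decay=0.5):
    """Geometric adstock: each week carries over a fraction of prior effect."""
    out = np.zeros_like(spend, dtype=float)
    for t in range(len(spend)):
        out[t] = spend[t] + (decay * out[t - 1] if t > 0 else 0.0)
    return out

rng = np.random.default_rng(42)
weeks = 104
tv = rng.uniform(0, 100, weeks)
digital = rng.uniform(0, 60, weeks)
sales = 200 + 1.8 * adstock(tv) + 2.5 * adstock(digital, 0.3) + rng.normal(0, 20, weeks)

X = np.column_stack([adstock(tv), adstock(digital, 0.3)])
model = LinearRegression().fit(X, sales)
print("Base sales:", round(model.intercept_, 1))
print("Contribution per unit of adstocked spend:", model.coef_.round(2))
```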
Posted 1 month ago
3 - 5 years
8 - 11 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Job Title: Data Engineer – Snowflake & Python About the Role: We are seeking a skilled and proactive Data Developer with 3-5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools like Matillion, Fivetran, etc. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation. Key Responsibilities: Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar). Develop data applications and dashboards using Python and Streamlit. Create and optimize complex SQL queries for data extraction, transformation, and loading. Integrate REST APIs for data access and process automation. Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity. Design and implement scalable and efficient data models aligned with business requirements. Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions. Implement best practices in data governance, security, and compliance. Required Skills and Qualifications: 3-5 years of professional experience in a data engineering or development role. Strong expertise in Snowflake, including performance tuning and warehouse optimization. Proficient in Python, including data manipulation with libraries like Pandas. Experience building web-based data tools using Streamlit. Solid understanding and experience with RESTful APIs and JSON data structures. Strong SQL skills and experience with advanced data transformation logic. Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow). Hands-on experience in data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques. Familiarity with version control (e.g., Git) and CI/CD processes is a plus. Preferred Qualifications: Experience working in cloud environments (AWS, Azure, or GCP). Knowledge of data governance and cataloging tools. Experience with agile methodologies and working in cross-functional teams.
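The listing above pairs Snowflake, Python, and Streamlit for data applications. Below is a minimal, hedged sketch of a Streamlit page that reads from Snowflake via the snowflake-connector-python package; the account, credentials, and table name are hypothetical placeholders, and in practice they would come from st.secrets or environment variables rather than literals.

```python
# Minimal sketch: a Streamlit page backed by a Snowflake query.
import pandas as pd
import snowflake.connector
import streamlit as st

@st.cache_data(ttl=600)                       # cache query results for 10 minutes
def load_orders() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="my_account",                 # hypothetical placeholders
        user="reporting_user",
        password="********",
        warehouse="ANALYTICS_WH",
        database="SALES_DB",
        schema="PUBLIC",
    )
    try:
        return pd.read_sql("SELECT order_date, region, amount FROM orders", conn)
    finally:
        conn.close()

st.title("Daily Orders")
df = load_orders()
region = st.selectbox("Region", sorted(df["region"].unique()))
filtered = df[df["region"] == region]
st.dataframe(filtered)
st.line_chart(filtered.set_index("order_date")["amount"])
```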
Posted 1 month ago
7 - 12 years
3 - 7 Lacs
Bengaluru
Work from Office
Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Cloud Data Architecture Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Data Architect: Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, the Architect must streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS and on-premises systems, as well as data platform selection and onboarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements and define cost-effective patterns to be implemented by other teams. You must then be able to represent the required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS and Azure) architectural analysis and management. Responsibilities: Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. The guidelines should encourage reuse of existing data products, as well as address issues of security, timeliness, and quality. Work with Information & Insights, Data Governance, Business Data Stewards, and Implementation teams to define standard and ad-hoc data products and data product sets. Work with Enterprise Architecture, Security, and Implementation teams to define the transformation of data products throughout hybrid cloud environments, ensuring that both functional and non-functional requirements are addressed. This includes the ownership, frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations. Work with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that the occasional subtle differences in meaning are understood. Define integrative views of data to draw together data from across the enterprise. Some views will use data stores of extracted data and others will bring together data in near real time. Solutions must consider data currency, availability, response times and data volumes, etc. Work with modeling and storage teams to define Conceptual, Logical and Physical data views, limiting technical debt as data flows through transformations. Investigate and lead participation in POCs of emerging technologies and practices.
Leverage and evolve existing [core] data products and patterns. Communicate and lead understanding of data architectural services across the enterprise. Ensure a focus on data quality by working effectively with data and system stewards. Qualifications: Bachelor's degree in Computer Science, Computer Engineering, or equivalent experience. A minimum of 3 years' experience in a similar role. Demonstrable knowledge of Secure DevOps and SDLC processes. Must have AWS or Azure experience. Experience with Data Vault 2 required; Snowflake a plus. Familiarity with system concepts and tools within an enterprise architecture framework, including Cataloging, MDM, RDM, Data Lakes, Storage Patterns, etc. Excellent organizational and analytical abilities. Outstanding problem solver. Good written and verbal communication skills.
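Data Vault 2 is called out as a requirement above. Below is a minimal, hedged sketch of the hub/satellite/link pattern for an insurance-flavored policy/claim example; the table and column names are hypothetical, and in-memory SQLite is used only so the DDL executes end to end (on Snowflake the types would typically be VARCHAR and TIMESTAMP_NTZ, with hash keys computed during loading).

```python
# Minimal sketch of Data Vault 2.0 style structures: hub, satellite, link.
import sqlite3

DDL = [
    """CREATE TABLE hub_policy (
           policy_hk      TEXT PRIMARY KEY,   -- hash of the business key
           policy_number  TEXT NOT NULL,      -- business key
           load_dts       TEXT NOT NULL,
           record_source  TEXT NOT NULL
       )""",
    """CREATE TABLE sat_policy_details (
           policy_hk      TEXT NOT NULL,
           load_dts       TEXT NOT NULL,
           hash_diff      TEXT NOT NULL,      -- supports change detection
           policy_status  TEXT,
           premium_amount REAL,
           PRIMARY KEY (policy_hk, load_dts)
       )""",
    """CREATE TABLE link_policy_claim (
           policy_claim_hk TEXT PRIMARY KEY,  -- hash of the related hub keys
           policy_hk       TEXT NOT NULL,
           claim_hk        TEXT NOT NULL,
           load_dts        TEXT NOT NULL,
           record_source   TEXT NOT NULL
       )""",
]

conn = sqlite3.connect(":memory:")
for statement in DDL:
    conn.execute(statement)
print("Created:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```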
Posted 1 month ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
Major tech hubs such as Bengaluru, Hyderabad, Pune, and Gurugram are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience levels: - Entry-level: INR 6-8 lakhs per annum - Mid-level: INR 10-15 lakhs per annum - Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as: - Junior Snowflake Developer - Snowflake Developer - Senior Snowflake Developer - Snowflake Architect - Snowflake Consultant - Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in: - SQL - Data warehousing concepts - ETL tools - Cloud platforms (AWS, Azure, GCP) - Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!