Infocepts enables improved business results through more effective use of data, AI, and user-friendly analytics. We partner with our clients to resolve the most common and complex challenges standing in the way of using data to strengthen business decisions.
Pune, Maharashtra
INR Not disclosed
Work from Office
Full Time
Position: Analytics Architect (MicroStrategy)

Purpose of the Position: The Analytics Architect is responsible for architecting and building comprehensive data-focused solutions that adhere to business objectives. The architect must evaluate functional and technical business requirements and transform them into conceptual/logical data models. You will partner with key business stakeholders to develop solutions that adhere to corporate architecture standards. To be successful in this role, one must work effectively in a fast-paced agile environment and with minimal supervision on multiple concurrent projects.

Location: India (Nagpur, Pune preferred)
Type of Employment: Full time

Key Result Areas and Activities: The individual must demonstrate expertise and results in the following KRAs: discover and address unmet needs; architect solutions and coordinate their implementation; lead and support growth opportunities; research and experiment to innovate; build and reuse knowledge and expertise; nurture and grow technical talent.

Architect and Design Solutions: Conduct current-state assessments to prescribe a D&A modernization strategy and data-culture transformation roadmap. Architect and design modern, high-performance, secure, self-service, adoption-driven analytics solutions incorporating people, process, and technology levers. Review analytics solutions and processes for optimization and business value acceleration.

Provide Advisory and Consulting Services: Drive consulting engagements in areas of expertise across projects/accounts. Support pre-sales activities including engaging prospects, conducting workshops, developing points of view/solutions/scoping, writing proposals, engaging partners, and supporting sales and marketing enablement.

Systematically Develop and Maintain Offerings: Support offering development and lead its pilot implementation. Ensure implementation methodologies, processes, and templates are defined, enhanced, and internalized for the prioritized solutions.

Essential Skills: Strong knowledge of MicroStrategy architecture and components for effective solutioning. Proficient in creating schema objects (attributes, facts, logical tables, hierarchies) and public objects (reports, metrics, filters, prompts, documents, dashboards). Proficient in developing complex reports using MicroStrategy Desktop and Web, including templates, filters, prompts, consolidations, and custom groups. Experience with MicroStrategy Transaction Services and ensuring robust user security. Proficient in multiple analytics platforms such as MicroStrategy, with exposure to at least one of Tableau, Power BI, or SSRS. Thorough understanding of relational database platforms and experience in designing and building dimensional data models. Well versed in the latest trends in analytics, business intelligence, and data visualization.

Desirable Skills: Knowledge of MicroStrategy report performance tuning and optimization techniques is a value add. Strong technical skills and a thorough understanding of data warehouse concepts. Experience with Microsoft Power Query and Power BI.

Qualification: Bachelor's degree in computer science, engineering, or a related field (master's degree is a plus). Demonstrated continued learning through one or more technical certifications or related methods. At least 8 years of relevant experience; two years may be substituted for a master's degree.

Qualities: Self-motivated and focused on delivering outcomes for a fast-growing team and firm. Able to communicate persuasively through speaking, writing, and client presentations. Able to consult, write, and present persuasively. Able to work in a self-organized and cross-functional team. Able to iterate based on new information, peer reviews, and feedback. Able to work with teams and clients in different time zones. Research-focused mindset and experience with team management.

Location: India
Years of Exp: 10 to 15 years
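The dimensional data modeling named in the essential skills above can be sketched concretely. This is a minimal illustration only, assuming hypothetical table and column names, of a star schema with one fact and one dimension, plus the kind of aggregate "metric by attribute" query a MicroStrategy report would generate:

```python
import sqlite3

# Minimal star-schema sketch: one dimension, one fact table, and an
# aggregate metric (SUM of amount) sliced by an attribute (category).
# Table and column names are illustrative, not from any real project.
def build_star_schema(conn):
    cur = conn.cursor()
    cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
    cur.execute("CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, "
                "product_id INTEGER REFERENCES dim_product(product_id), amount REAL)")
    cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                    [(1, "Books"), (2, "Toys")])
    cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                    [(10, 1, 25.0), (11, 1, 15.0), (12, 2, 40.0)])
    conn.commit()

def revenue_by_category(conn):
    # The fact-to-dimension join plus GROUP BY mirrors a simple report:
    # metric = SUM(amount), attribute = category.
    cur = conn.execute(
        """SELECT d.category, SUM(f.amount)
           FROM fact_sales f JOIN dim_product d USING (product_id)
           GROUP BY d.category ORDER BY d.category""")
    return dict(cur.fetchall())

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    build_star_schema(conn)
    print(revenue_by_category(conn))  # {'Books': 40.0, 'Toys': 40.0}
```

In a BI tool the same schema objects (attributes, facts) are defined once and the SQL is generated per report; the sketch just makes that mapping explicit.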
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Purpose of the Position: The purpose of this role is to drive growth of the Cloud & Data Engineering business at InfoCepts as a Solution Architect. Major responsibilities under this role are to engage with clients in advisory and consulting, advance solutions, support presales as a solution architect, build capabilities, drive research and experimentation, and nurture talent in cloud and data engineering. InfoCepts provides end-to-end data and analytics solutions to commercial and public sector customers in various markets across the globe, with a concentration in the US, EMEA, and APAC regions. InfoCepts invests in four core competencies: Cloud & Data Engineering (CDE), Analytics and Data Management (ADM), Business Consulting (BC), and Service Management (SM). These enable continued access to global markets and unmet needs. Reporting to the Director of the Cloud & Data Engineering Competency Center, this role requires collaboration with internal stakeholders such as practices, delivery, sales, marketing, and talent fulfilment and development functions, as well as with clients and partners.

Location: India

Key Result Areas and Activities:
Engage with clients, delivery & market: Engage with new and existing clients to provide advisory consulting and address unmet needs in cloud data technologies. Proactively track D&A trends and influence clients' data strategy and choices through thought leadership. Identify white spaces and mine opportunities for growth.
Architect solutions & coordinate their implementation: Architect and design modern, high-performance D&A solutions incorporating people, process, and technology levers in cloud and big data technologies. Coordinate with delivery for adoption with a focus on greater reuse and AI infusion for accelerated and risk-mitigated implementation.
Lead & support opportunities: Support pre-sales activities including conducting discovery workshops, developing points of view/solutions/scoping, writing proposals, and supporting sales and marketing enablement for cloud and big data related technologies.
Guide research & experiments to address unmet needs: Proactively drive research and experimentation through the CoEs to ensure that Infocepts stays ahead of the curve in emerging areas of cloud data engineering and architecture. Identify specific opportunities for innovation to create differentiation and address unmet needs.
Build & reuse knowledge & expertise: Build foundation components in the form of reusable assets in cloud and big data technologies which can be leveraged across projects and client accounts. Advance InfoCepts solutions through effective use of one or more foundation components. Influence the D&A space by creating thought-leadership content and participating in external events and forums. Anchor and leverage Communities of Practitioners (CoPs) for knowledge sharing and mobilizing support for innovation at scale.
Nurture & grow talent: Develop a differentiated D&A talent pool through recruitment and learning and development in cloud data engineering competencies. Mentor associates to help them advance their careers in D&A using the InfoCepts Career Architecture Framework.

Work and Technical Experience:
Must Have: Experience in the D&A domain, specifically in cloud data technologies. Experience in solution design, architecture, and implementation of modern hybrid or multi-cloud data analytics solutions. Demonstrable presales and solutioning experience in conducting workshops and building proposals. Proficiency in the D&A stack of one or more hyperscaler platforms such as Azure, AWS, or GCP, and modern data platforms such as Snowflake and Databricks. Demonstrable thought leadership through blogs, white papers, and presentations at D&A events. In-depth, hands-on experience in data modeling, data organization, and data engineering on cloud. Diverse D&A experiences including greenfield implementations, data modernization, D&A migrations from on-premises to cloud, and development of full-stack data products. Experience designing and implementing domain-driven architecture, data mesh, and data fabric architectural patterns.
Good to have: Well versed in data management practices such as data quality management, data cataloging, and data governance in hybrid and multi-cloud environments. Well versed in DataOps automation using custom or cloud-native services on hyperscaler platforms. Well versed in DevSecOps considerations for cloud-first data analytics solutions. Sound understanding of cloud infrastructure provisioning, data observability, and Cloud FinOps. Demonstrable advisory and consulting experience on strategic D&A initiatives. Sound business appreciation to understand business and functional requirements for solution architecture considerations. Experience in evaluating and comparing multiple cloud data technology choices and making right-fit recommendations in the client context. Working knowledge of frameworks like TOGAF and DAMA.

Qualifications: Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus). Demonstrated continued learning through one or more technical certifications or related methods. At least 14 years of experience in data analytics with a focus on cloud data technologies.

Qualities: Self-motivated and focused on delivering outcomes for a fast-growing team and firm. Able to communicate persuasively through speaking, writing, and client presentations. Able to consult, write, and present persuasively. Able to work in a self-organized and cross-functional team. Able to iterate based on new information, peer reviews, and feedback. Able to work with teams and clients in different time zones. Research-focused mindset.
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Position: Sr. Cloud Data Engineer (AWS-Big Data)
Location: Nagpur/Pune/Chennai/Bangalore

Purpose of the Position: As a Sr. Cloud Data Engineer, this position requires candidates who are enthusiastic about specialized skills in AWS services and big data. As a member of the team, you will help our clients by building solutions that support their progress on the AWS cloud journey.

Key Result Areas and Activities:
Share and Build Expertise: Develop and share expertise in the cloud solutioning domain and actively mine the experience and expertise in the organization for sharing across teams and clients in the firm. Support the cloud COE initiatives.
Nurture and Grow Talent: Provide support for recruitment, coaching and mentoring, and building practice capacity in the firm in cloud.
AWS Integration: Integrate various AWS services to create seamless workflows and data processing solutions.
Data Pipeline Development: Design, build, and maintain scalable data pipelines using AWS services to support data processing and analytics.

Essential Skills:
AWS services: S3, EC2, EMR, Serverless, Athena, AWS Glue, Lambda, Step Functions
Cloud databases: AWS Aurora, SingleStore, Redshift, Snowflake
Big data: Hadoop, Hive, Spark, YARN
Programming languages: Scala, Python, shell scripts, PySpark
Operating systems: any flavor of Linux, Windows
Strong SQL skills
Orchestration tools: Apache Airflow
Hands-on experience developing ETL workflows comprising complex transformations such as SCDs, deduplications, and aggregations.

Desirable Skills: Experience and technical knowledge in Databricks. Strong experience with event-stream-processing technologies such as Kafka and KDS. Knowledge of ETL tools such as Informatica and Talend. Experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR) is a plus.

Qualifications: Overall 7-9 years of IT experience with 3+ years on AWS-related projects. Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus). Demonstrated continued learning through one or more technical certifications or related methods.

Qualities: Strong technical knowledge and experience. Capable of deep dives and research across various technical fields. Self-motivated and focused on delivering outcomes for a fast-growing team and firm. Able to communicate persuasively through speaking, writing, and client presentations.
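The ETL transformations called out above (deduplication, slowly changing dimensions) have a compact core. In production this would run in PySpark or AWS Glue; the following is a plain-Python sketch of the same Type-2 SCD bookkeeping, with illustrative field names, so the logic is easy to follow:

```python
from datetime import date

def dedupe(rows, key):
    """Deduplicate staged rows: keep the last occurrence per business key."""
    seen = {}
    for row in rows:
        seen[row[key]] = row
    return list(seen.values())

def scd2_merge(dimension, incoming, key, tracked, today):
    """SCD Type 2: expire changed current rows, append new current versions."""
    current = {r[key]: r for r in dimension if r["end_date"] is None}
    out = list(dimension)
    for row in incoming:
        old = current.get(row[key])
        if old is not None and old[tracked] == row[tracked]:
            continue  # no change: keep the existing current row
        if old is not None:
            old["end_date"] = today  # expire the superseded version
        out.append({key: row[key], tracked: row[tracked],
                    "start_date": today, "end_date": None})
    return out

if __name__ == "__main__":
    dim = [{"cust_id": 1, "city": "Pune",
            "start_date": date(2023, 1, 1), "end_date": None}]
    staged = [{"cust_id": 1, "city": "Nagpur"},
              {"cust_id": 1, "city": "Nagpur"}]  # duplicate to remove
    merged = scd2_merge(dim, dedupe(staged, "cust_id"),
                        "cust_id", "city", date(2024, 1, 1))
    print([r["city"] for r in merged if r["end_date"] is None])  # ['Nagpur']
```

The same expire-and-append pattern maps directly onto a Spark join between the dimension and the staged dataframe.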
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Position: Lead Data Scientist (12-18 years)
We are seeking a talented and experienced Data Scientist to join our growing team. The ideal candidate will have a strong background in supply chain and operations, with expertise in developing advanced data science solutions for anomaly detection, forecasting, and revenue optimization using convolutional neural networks (CNNs), Python, and PySpark.
Location: Chennai/Bangalore/Pune/Nagpur

Roles & Responsibilities:
Collaborate with cross-functional teams to understand business requirements and identify opportunities for leveraging data science to drive revenue growth and improve operational efficiency in the supply chain and operations domain.
Design, develop, and deploy advanced data science models and algorithms for traditional and GenAI business products, with a focus on CNN-based anomaly detection, forecasting, and revenue optimization.
Analyse large and complex datasets to extract actionable insights and develop predictive models that provide valuable business intelligence and support data-driven decision-making processes.
Implement scalable and efficient data processing pipelines using Python and PySpark to pre-process, clean, and transform raw data into a format suitable for analysis and modelling.
Evaluate the performance of data science models using appropriate metrics and techniques, and iterate on model design and parameters to continuously improve accuracy and effectiveness.
Collaborate with software engineers and DevOps teams to integrate data science models into production systems and ensure scalability, reliability, and performance.
Stay up to date with the latest advancements in data science, machine learning, and artificial intelligence technologies, and proactively identify opportunities to apply new techniques and methodologies to solve business challenges.

Essential Skills:
Bachelor's degree or higher in computer science, engineering, mathematics, statistics, or a related field. Advanced degree preferred.
10+ years of experience in data science, machine learning, and predictive analytics, with a focus on supply chain and operations.
Strong proficiency in Python and PySpark for data manipulation, analysis, and modelling. Experience with libraries such as Pandas, NumPy, TensorFlow, and PyTorch is highly desirable.
Solid understanding of convolutional neural networks (CNNs) and deep learning techniques for anomaly detection, forecasting, and revenue optimization.
Experience working with large-scale datasets and distributed computing frameworks for processing and analysing big data, such as Hadoop, Spark, and Databricks.
Proven track record of developing and deploying data science solutions in production environments, with a focus on delivering measurable business impact and value.
Excellent communication skills and ability to effectively collaborate with cross-functional teams, including business stakeholders, data engineers, software developers, and product managers.
Strong analytical and problem-solving skills, with a passion for using data-driven approaches to solve complex business problems and drive innovation.

Good to have: This individual will be self-directed, highly motivated, and organized, with strong analytical thinking and problem-solving skills, and an ability to work on multiple projects and function in a team environment. Expertise in managing complex projects, ensuring timely delivery and adherence to budget.

Qualifications: B.E/B.Tech/M.Tech qualification

Qualities: Strong leadership and team management. Excellent project management skills. Effective communication and collaboration. Analytical and strategic thinking. Adaptability in multi-cultural environments.
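Anomaly detection in this role would typically use a CNN that flags points with large reconstruction error. The decision rule itself (score each point, flag those past a threshold) can be shown with a much simpler statistical baseline; this sketch uses z-scores, and the 2.5-sigma threshold is an illustrative assumption, not a recommendation:

```python
import math

def zscore_anomalies(series, threshold=2.5):
    """Flag indices whose z-score exceeds the threshold.

    A CNN-based detector replaces the z-score with a learned
    reconstruction error, but the thresholding step is the same.
    The 2.5 default is an illustrative assumption.
    """
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    if std == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, x in enumerate(series) if abs(x - mean) / std > threshold]

if __name__ == "__main__":
    # Hypothetical daily demand with one spike at index 6.
    demand = [100, 102, 98, 101, 99, 100, 500, 97, 103, 100]
    print(zscore_anomalies(demand))  # [6]
```

In a supply-chain setting the series would be a demand or shipment metric per SKU, and flagged indices would feed a review queue rather than trigger automatic action.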
Chennai
INR 12.0 - 16.0 Lacs P.A.
Work from Office
Full Time
G6 Cloud Data Architect (Snowflake + DBT); this is for proactive hiring
Nagpur, Pune, Bengaluru
INR 5.0 - 13.0 Lacs P.A.
Hybrid
Full Time
Role & responsibilities
Position: Tableau Developer
Purpose of the Position: Design, develop, support, and steer end-to-end business intelligence solutions using Tableau. InfoCepts is a global leader in end-to-end data and analytics solutions with nearly 20 years' experience enabling customers to derive value from a variety of data-driven capabilities. Unique among its peers, InfoCepts operates with the scale of a global consulting firm yet the expertise of a niche partner.
Work Location: Pune and Nagpur preferred
Type of Employment: FTE

Key Responsibilities:
Business Requirements: Experience in providing analytics solutions while balancing architecture requirements, effort estimations, and customer-specific needs. Working with end users to design and build dashboards and customizations to meet their requirements and suit their roles.
Technical Translation: Designing and implementing data warehouses/analytics solutions. Defining and configuring the security model within Tableau deployments. Hands-on working experience with Tableau to author queries, datasets, visuals, and reports.
Documentation: Performing gap analyses, maturity assessments, and developing analytics technology roadmaps.
Analytical Skills: Excellent data modelling skills (RDBMS concepts, normalization, dimensional modelling, star/snowflake schema, etc.). Well versed in the latest trends in analytics, business intelligence, and data visualization. Capable of analytical technology assessment and strategic decision-making.

Work and Technical Experience:
Must have: 5+ years of hands-on experience in Tableau dashboard development, optimizing performance, and managing medium to complex dashboards (including row-level security). Strong understanding of data connections, optimized models, relationships, joins, unions, data blending, and handling date/time calculations effectively. Skilled in writing optimized calculations, table calculations, and cascading filters, with proficient SQL knowledge. Basic knowledge of Tableau admin activities (e.g., migrations, user/group additions, schedule updates) and experience in the banking domain. Ability to lead development teams, resolve technical blockers, and develop reusable artifacts, frameworks, and industry solutions. Excellent written and verbal communication skills in English, suited for collaboration and requirement gathering.
Good to have: Expertise in multiple analytics platforms such as Power BI or MicroStrategy. Experience in developing reusable artifacts, frameworks, reusable assets, industry solutions, etc. Experience in converting business requirements to mock-ups using tools like Figma.

Qualifications: Bachelor's degree in computer science, engineering, or a related field (master's degree is a plus). Demonstrated continued learning through one or more relevant certifications or related methods. At least 5 years of relevant experience.

Qualities: Self-motivated and focused on delivering outcomes for a fast-growing team. Strong interpersonal skills. Able to work in self-organized and cross-functional teams. Able to work with teams and clients in different time zones. Able to quickly acquire and develop new capabilities and skills.

Preferred candidate profile: Immediate joiners only.
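Table calculations such as RUNNING_SUM or a moving average operate over the rows of a rendered view, which is also how a developer validates a dashboard's numbers against the source data. A minimal sketch of that logic in plain Python (the window size of 3 is an illustrative choice):

```python
def running_sum(values):
    """Equivalent of Tableau's RUNNING_SUM over the rows of a view."""
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

def moving_average(values, window=3):
    """Trailing moving average; shorter windows at the start, as in Tableau."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

if __name__ == "__main__":
    sales = [10, 20, 30, 40]
    print(running_sum(sales))     # [10, 30, 60, 100]
    print(moving_average(sales))  # [10.0, 15.0, 20.0, 30.0]
```

Recomputing a table calculation independently like this is a quick cross-check when a dashboard figure looks off.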
Nagpur, Pune
INR 8.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Role & responsibilities
Sr. QA Engineer Job Description
Position Name: Sr. Quality Assurance Engineer
Purpose of the Position: The purpose of the Sr. Quality Assurance Engineer position is to ensure the highest standards of quality in our software products through comprehensive testing and validation processes, with particular expertise in Business Intelligence (BI) and Extract, Transform, Load (ETL) testing, along with proficiency in SQL. This role involves creating and executing test plans, validating data accuracy and integrity, and ensuring the implementation of business rules, functionality, and performance of BI tools. The QA Engineer will also be responsible for verifying ETL processes, including data extraction, transformation, and loading, to guarantee that data is reliable and accessible for reporting and analysis. By collaborating closely with development and business teams, the QA Engineer will enhance product quality and performance, contributing to the overall success of the organization.
Location: Nagpur/Pune
Type of Employment: Full-time

Key Result Areas and Activities:
Testing and QA Expertise: Minimum of 7-8 years in a Testing/QA role. Strong grasp of QA concepts and lifecycle methodologies.
Test Planning and Execution: Review requirements, specifications, and technical design documents to provide timely and meaningful feedback. Create detailed, comprehensive, and well-structured test cases, including criteria for data, functional, and performance testing. Estimate, prioritize, plan, coordinate, and execute testing activities effectively.
Process Development and Improvement: Develop and apply testing processes for new and existing projects to meet client needs. Troubleshoot quality issues and modify test procedures as necessary.
Reporting and Collaboration: Provide status reports on test execution, including tracking tests and issues/bugs. Collaborate with business teams to finalize test plans and test cases.
Technical Skills and Teamwork: Understand SQL concepts and perform data testing using SQL. Work efficiently as a team player with minimal supervision. Some knowledge of cloud and AWS services preferred.

Essential Skills: Proficient understanding of QA processes and best practices. Strong experience with SQL and its variations among popular databases (Oracle/Snowflake), including data testing. Hands-on experience with ETL and BI (MicroStrategy) testing. Ability to understand the business/domain and functional knowledge. Able to understand and work with business/onsite stakeholders on requirements. Ability to create detailed and comprehensive test cases for data, functional, and performance testing. Familiarity with automation testing tools and frameworks is a plus. Basic knowledge of scripting languages (e.g., Python, shell) for automation purposes.

Desirable Skills: Experience in Agile frameworks (Scrum, Kanban) and with tools like Jira. Familiarity with additional BI tools beyond MicroStrategy, such as Tableau. Understanding of data warehousing principles and architecture. Basic knowledge of cloud services is nice to have.

Qualifications: Bachelor's degree in computer science, information technology, or a related field. Minimum of 7 years of experience in a Testing/QA role, with a focus on BI and ETL testing.

Qualities: Strong focus on accuracy and thoroughness in testing processes and documentation. Ability to think critically and analyse complex data to identify issues and improvement opportunities. Willingness to learn new tools, technologies, and processes to keep up with industry trends. Collaborative attitude with a willingness to support team members and contribute to a positive team environment. Ability to work independently with minimal supervision while managing multiple tasks efficiently.

Preferred candidate profile: Immediate joiners only.
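A core ETL-testing task in this role is verifying that loaded data matches the source. The sketch below reconciles row counts and a column total between two tables; it uses an in-memory SQLite database purely for illustration (in a real engagement the connections would point at, e.g., Oracle and Snowflake, and the table names are hypothetical):

```python
import sqlite3

def reconcile(conn, source, target, amount_col):
    """Compare row count and summed amount between source and target tables."""
    checks = {}
    for name in (source, target):
        count, total = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {name}"
        ).fetchone()
        checks[name] = (count, round(total, 2))
    # Pass only when both measures agree exactly.
    return checks[source] == checks[target], checks

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.5), (2, 20.0)])
    conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.5), (2, 20.0)])
    ok, detail = reconcile(conn, "src", "tgt", "amount")
    print(ok)  # True
```

Count-plus-checksum reconciliation catches dropped and duplicated rows cheaply; column-level value comparisons are the follow-up when it fails.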
Chennai, Tamil Nadu
Not disclosed
On-site
Not specified
G6 Cloud Data Architect (Snowflake + DBT); this is for proactive hiring. Location: Chennai
Chennai
INR 4.62 - 9.82 Lacs P.A.
On-site
Part Time
G6 Cloud Data Architect (Snowflake + DBT); this is for proactive hiring. Location: Chennai
Nagpur, Pune, Bengaluru
INR 7.0 - 15.0 Lacs P.A.
Hybrid
Full Time
Role & responsibilities: Tableau architecture; troubleshooting; SAML and SSL activities; Tableau upgrades.
Preferred candidate profile: Immediate joiners only.
Chennai
INR 13.0 - 17.0 Lacs P.A.
Work from Office
Full Time
InfoCepts is looking for a Data Architect - Snowflake & DBT to join our dynamic team and embark on a rewarding career journey.

Design and Development: Create and implement data warehouse solutions using Snowflake, including data modeling, schema design, and ETL (Extract, Transform, Load) processes.
Performance Optimization: Optimize queries, performance-tune databases, and ensure efficient use of Snowflake resources for faster data retrieval and processing.
Data Integration: Integrate data from various sources, ensuring compatibility, consistency, and accuracy.
Security and Compliance: Implement security measures and ensure compliance with data governance and regulatory requirements, including access control and data encryption.
Monitoring and Maintenance: Monitor system performance, troubleshoot issues, and perform routine maintenance tasks to ensure system health and reliability.
Collaboration: Collaborate with other teams, such as data engineers, analysts, and business stakeholders, to understand requirements and deliver effective data solutions.

Skills and Qualifications:
Snowflake Expertise: In-depth knowledge and hands-on experience working with Snowflake's architecture, features, and functionalities.
SQL and Database Skills: Proficiency in SQL querying and database management, with a strong understanding of relational databases and data warehousing concepts.
Data Modeling: Experience in designing and implementing effective data models for optimal performance and scalability.
ETL Tools and Processes: Familiarity with ETL tools and processes to extract, transform, and load data into Snowflake.
Performance Tuning: Ability to identify and resolve performance bottlenecks, optimize queries, and improve overall system performance.
Data Security and Compliance: Understanding of data security best practices, encryption methods, and compliance standards (such as GDPR, HIPAA, etc.).
Problem-Solving and Troubleshooting: Strong analytical and problem-solving skills to diagnose and resolve issues within the Snowflake environment.
Communication and Collaboration: Good communication skills to interact with cross-functional teams and effectively translate business requirements into technical solutions.
Scripting and Automation: Knowledge of scripting languages (like Python) and experience in automating processes within Snowflake.
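Snowflake loads of the kind described above are often implemented as MERGE upserts from a staging table, and scripting such statements in Python is a common automation task. This helper only assembles the SQL; table and column names are placeholders, and in practice it would be executed through a live Snowflake session rather than printed:

```python
def build_merge(target, staging, key, cols):
    """Assemble a Snowflake-style MERGE upsert from staging into target.

    `target`, `staging`, `key`, and `cols` are caller-supplied
    placeholders; the statement is returned as text for illustration.
    """
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

if __name__ == "__main__":
    print(build_merge("dw.customers", "stg.customers",
                      "customer_id", ["name", "city"]))
```

A dbt model would express the same upsert declaratively (an incremental model with a unique key); the generated SQL has this shape either way.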
Bengaluru
INR 16.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Key Result Areas: Architect modern D&A solutions using best-of-breed cloud services, specifically from GCP, aligned to client needs and drive implementation for successful delivery. Demonstrate expertise for client success through delivery support and thought leadership in cloud data architecture with a focus on GCP data & analytics services. Contribute to business growth through presales support for GCP-based solutions. Research & experiment to address unmet needs through innovation. Build & reuse knowledge, expertise & foundational components for cloud data architecture and data engineering, specifically on GCP. Grow & nurture technical talent within the Infocepts GCP community of practice.

Must-Have Skills: Deep hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer. Proven ability to design enterprise-grade data lakes, warehouses, and real-time data systems. Strong command of Python and SQL for data engineering and automation tasks. Expertise in building and managing complex ETL/ELT pipelines using tools like Apache Beam or Airflow. Experience in leading teams, conducting code reviews, and engaging with senior stakeholders.

Good-to-Have Skills: Familiarity with Terraform or Deployment Manager for GCP resource provisioning. Experience with Kafka, Apache Beam, or similar technologies. Knowledge of data lineage, cataloging, encryption, and compliance frameworks (e.g., GDPR, HIPAA). Exposure to integrating data pipelines with ML models and Vertex AI. Understanding of Looker, Tableau, or Power BI for data consumption.

Qualifications: Overall work experience of 12+ years with a minimum of 3 to 6 years' experience on GCP-related projects. BS degree in IT, MIS, or a business-related functional discipline. Experience with or knowledge of Agile software development methodologies.
Bengaluru, Karnataka
Not disclosed
On-site
Not specified
Key Result Areas: Architect modern D&A solutions using best-of-breed cloud services, specifically from GCP, aligned to client needs and drive implementation for successful delivery. Demonstrate expertise for client success through delivery support and thought leadership in cloud data architecture with a focus on GCP data & analytics services. Contribute to business growth through presales support for GCP-based solutions. Research & experiment to address unmet needs through innovation. Build & reuse knowledge, expertise & foundational components for cloud data architecture and data engineering, specifically on GCP. Grow & nurture technical talent within the Infocepts GCP community of practice.

Must-Have Skills: Deep hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer. Proven ability to design enterprise-grade data lakes, warehouses, and real-time data systems. Strong command of Python and SQL for data engineering and automation tasks. Expertise in building and managing complex ETL/ELT pipelines using tools like Apache Beam or Airflow. Experience in leading teams, conducting code reviews, and engaging with senior stakeholders.

Good-to-Have Skills: Familiarity with Terraform or Deployment Manager for GCP resource provisioning. Experience with Kafka, Apache Beam, or similar technologies. Knowledge of data lineage, cataloging, encryption, and compliance frameworks (e.g., GDPR, HIPAA). Exposure to integrating data pipelines with ML models and Vertex AI. Understanding of Looker, Tableau, or Power BI for data consumption.

Qualifications: Overall work experience of 12+ years with a minimum of 3 to 6 years' experience on GCP-related projects. BS degree in IT, MIS, or a business-related functional discipline. Experience with or knowledge of Agile software development methodologies.

Location: Bangalore. Years of Exp: 9 to 13 years.
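The Dataflow/Apache Beam pipelines named above are built as chains of transforms. Beam itself needs a runner, so the following is only a plain-Python sketch of the same shape (parse, filter, aggregate) with illustrative record fields, to make the pattern concrete:

```python
def parse(lines):
    """Parse raw CSV-ish lines into records (ParDo-style transform)."""
    for line in lines:
        region, amount = line.split(",")
        yield {"region": region, "amount": float(amount)}

def valid(records):
    """Drop malformed/negative records (Filter-style transform)."""
    return (r for r in records if r["amount"] > 0)

def total_by_region(records):
    """Sum per key (CombinePerKey-style transform)."""
    totals = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

if __name__ == "__main__":
    raw = ["apac,10.0", "emea,5.5", "apac,-1.0", "apac,2.5"]
    print(total_by_region(valid(parse(raw))))  # {'apac': 12.5, 'emea': 5.5}
```

In Beam the same stages become `beam.Map`/`beam.Filter`/`beam.CombinePerKey` steps on a PCollection, with BigQuery or Cloud Storage as source and sink.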
Mumbai
INR 6.0 - 10.0 Lacs P.A.
Work from Office
Full Time
Senior Azure Data Engineer - L1 Support
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 6.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Position: Data Engineer - MS Fabric Purpose of the Position: As an MS Fabric Data engineer you will be responsible for designing, implementing, and managing scalable data pipelines. Strong experience in implementation and management of lake House using MS Fabric Azure Tech stack (ADLS Gen2, ADF, Azure SQL) . Proficiency in data integration techniques, ETL processes and data pipeline architectures. Well versed in Data Quality rules, principles and implementation. Location: Bangalore/ Pune/ Nagpur/ Chennai Type of Employment: FTE Key Result Areas and Activities: 1. Data Pipeline Development Optimization Design and implement data pipelines using MS Fabric. Manage and optimize ETL processes for data extraction, transformation, and loading. Conduct performance tuning for data storage and retrieval to enhance efficiency. 2. Data Quality, Governance Documentation Ensure data quality and integrity across all data processes. Assist in designing data governance frameworks and policies. Generate and maintain documentation for data architecture and data flows. 3. Cross-Functional Collaboration Requirement Gathering Collaborate with cross-functional teams to gather and define data requirements. Translate functional and non-functional requirements into system specifications. 4. Technical Leadership Support Provide technical guidance and support to junior data engineers. Participate in code reviews and ensure adherence to coding standards. Troubleshoot data-related issues and implement effective solutions. Technical Experience: Must Have: Proficient in MS Fabric, Azure Data Factory, and Azure Synapse Analytics with deep knowledge of Fabric components like writing Notebook, Lakehouses, OneLake, Data Pipelines, and Real-Time Analytics. Skilled in integrating Fabric capabilities for seamless data flow, governance, and cross-team collaboration. Strong grasp of Delta Lake, Parquet, distributed data systems, and various data formats (JSON, XML, CSV, Parquet). 
Experienced in ETL/ELT processes, data warehousing, data modeling, and data quality frameworks. Proficient in Python, PySpark, Scala, Spark SQL, and T-SQL for complex data transformations. Familiar with Agile methodologies and tools like JIRA, with hands-on experience in monitoring tools and job scheduling.
Good To Have: Familiarity with Azure cloud platforms and cloud data services; MS Purview; open-source libraries such as Deequ, PyDeequ, and Great Expectations for data quality implementation; developing data models to support business intelligence and analytics; experience with Power BI dashboards; experience with Databricks.
Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field; 5+ years of experience in MS Fabric/ADF/Synapse.
Qualities: Experience with or knowledge of Agile software development methodologies. Able to consult, write, and present persuasively.
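The data-quality responsibilities above are the kind of checks that libraries such as Great Expectations or PyDeequ provide in production. As a minimal sketch of the underlying idea, the helper names below are hypothetical and not part of any real framework:

```python
# Hypothetical, minimal data-quality rules in plain Python; real
# pipelines would typically use Great Expectations or PyDeequ instead.

def check_not_null(rows, column):
    """Fail if any row has a missing value in `column`."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"rule": f"not_null({column})", "passed": not bad, "failing_rows": bad}

def check_in_range(rows, column, lo, hi):
    """Fail if any non-null value falls outside [lo, hi]."""
    bad = [i for i, r in enumerate(rows)
           if r.get(column) is not None and not (lo <= r[column] <= hi)]
    return {"rule": f"in_range({column})", "passed": not bad, "failing_rows": bad}

def run_checks(rows, checks):
    """Apply every check and collect results; a pipeline step could
    halt or quarantine rows when any check fails."""
    return [check(rows) for check in checks]

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": -5.0},
]
results = run_checks(rows, [
    lambda r: check_not_null(r, "amount"),
    lambda r: check_in_range(r, "amount", 0, 10_000),
])
```

A Fabric notebook would express the same rules over Spark DataFrames, but the pattern of declarative rules plus a collected report is the same.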
Chennai, Tamil Nadu
Not disclosed
On-site
Full Time
Position: Database Admin - Redshift

Purpose of the Position: You will be a critical member of the Infocepts Cloud Data Administrator Team. This position requires a deep understanding of Amazon Redshift, database performance tuning, and optimization techniques. A strong foundation in database concepts and SQL, and experience with AWS services, are essential.

Location: Nagpur/Pune/Bangalore/Chennai
Type of Employment: Full-time

Key Result Areas and Activities:
Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
Backup and Recovery: Develop and maintain database backup and recovery strategies.
Security Enforcement: Implement and enforce database security policies and procedures.
Cost-Performance Balance: Ensure an optimal balance between cost and performance.
Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries. Perform database migrations, upgrades, and patching.
Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams. Automate routine database tasks using scripting languages and tools.
Continuous Learning: Stay updated with the latest Redshift features, best practices, and industry trends. Deliver technology-focused training sessions and conduct expert knowledge sharing with client stakeholders as needed.
Documentation and Proposals: Assist in designing case study documents and collaborate with Centre of Excellence/Practice teams on proposals.
Mentorship and Recruitment: Mentor and groom junior DBAs and participate in conducting interviews for the organization.
Value-Added Improvements: Propose improvements to the existing database landscape.
Product Team Collaboration: Collaborate effectively with product teams to ensure seamless integration and performance.
Essential Skills:
Strong understanding of database design, performance tuning, and optimization techniques
Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
Experience with database backup and recovery, security, and high-availability solutions
Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
Operating system: any flavor of Linux, or Windows

Desirable Skills:
Knowledge of other database systems (e.g., Snowflake, SingleStore, PostgreSQL, MySQL)
AWS Certified Database - Specialty or other relevant certifications
Prior experience working in a large media company would be an added advantage

Qualifications:
Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
7-10 years of experience as a Database Administrator, with at least 5 years specifically with Amazon Redshift
Demonstrated continued learning through one or more technical certifications or related methods
Experience with data warehousing concepts and ETL processes

Qualities:
A quick self-learner, ready to adapt to new technologies as required
Able to deep-dive and research in various technical fields
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback

Location: India
Years of Experience: 7 to 10 years
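The query-tuning and schema-design duties above usually start with Redshift's physical design levers: DISTKEY co-locates joined rows on one slice, and SORTKEY lets the planner skip blocks on range-restricted scans. As an illustrative sketch, the helper below (a hypothetical function, not part of any Redshift client library) renders a DDL statement using those options:

```python
# Hypothetical helper that renders a Redshift CREATE TABLE statement.
# DISTKEY and SORTKEY are the two physical-design choices most Redshift
# performance tuning begins with.

def redshift_ddl(table, columns, distkey=None, sortkeys=()):
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    ddl = f"CREATE TABLE {table} (\n  {cols}\n)"
    # DISTSTYLE KEY requires a DISTKEY column; EVEN spreads rows round-robin.
    ddl += f"\nDISTSTYLE KEY DISTKEY ({distkey})" if distkey else "\nDISTSTYLE EVEN"
    if sortkeys:
        ddl += f"\nSORTKEY ({', '.join(sortkeys)})"
    return ddl + ";"

ddl = redshift_ddl(
    "sales",
    [("sale_id", "BIGINT"), ("customer_id", "BIGINT"), ("sold_at", "TIMESTAMP")],
    distkey="customer_id",   # frequent join column
    sortkeys=("sold_at",),   # frequent range-filter column
)
print(ddl)
```

Choosing the join column as DISTKEY and the filter column as SORTKEY is a common starting point; the right choice always depends on the actual query workload.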
Chennai
INR 5.0 - 8.0 Lacs P.A.
On-site
Full Time
Position: Database Admin - Redshift

Purpose of the Position: You will be a critical member of the Infocepts Cloud Data Administrator Team. This position requires a deep understanding of Amazon Redshift, database performance tuning, and optimization techniques. A strong foundation in database concepts and SQL, and experience with AWS services, are essential.

Location: Nagpur/Pune/Bangalore/Chennai
Type of Employment: Full-time

Key Result Areas and Activities:
Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
Backup and Recovery: Develop and maintain database backup and recovery strategies.
Security Enforcement: Implement and enforce database security policies and procedures.
Cost-Performance Balance: Ensure an optimal balance between cost and performance.
Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries. Perform database migrations, upgrades, and patching.
Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams. Automate routine database tasks using scripting languages and tools.
Continuous Learning: Stay updated with the latest Redshift features, best practices, and industry trends. Deliver technology-focused training sessions and conduct expert knowledge sharing with client stakeholders as needed.
Documentation and Proposals: Assist in designing case study documents and collaborate with Centre of Excellence/Practice teams on proposals.
Mentorship and Recruitment: Mentor and groom junior DBAs and participate in conducting interviews for the organization.
Value-Added Improvements: Propose improvements to the existing database landscape.
Product Team Collaboration: Collaborate effectively with product teams to ensure seamless integration and performance.
Essential Skills:
Strong understanding of database design, performance tuning, and optimization techniques
Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
Experience with database backup and recovery, security, and high-availability solutions
Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
Operating system: any flavor of Linux, or Windows

Desirable Skills:
Knowledge of other database systems (e.g., Snowflake, SingleStore, PostgreSQL, MySQL)
AWS Certified Database - Specialty or other relevant certifications
Prior experience working in a large media company would be an added advantage

Qualifications:
Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
7-10 years of experience as a Database Administrator, with at least 5 years specifically with Amazon Redshift
Demonstrated continued learning through one or more technical certifications or related methods
Experience with data warehousing concepts and ETL processes

Qualities:
A quick self-learner, ready to adapt to new technologies as required
Able to deep-dive and research in various technical fields
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback

Location: India
Years of Experience: 7 to 10 years
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Position: Technical Project Manager

Purpose of the Position: As a Technical Project Manager, you will be responsible for driving InfoCepts' client engagement, delivering a high-quality customer experience, and building opportunities for growth. InfoCepts provides end-to-end data & analytics solutions to commercial and public sector customers in various markets across the globe, with a concentration in the US, EMEA, and APAC regions. InfoCepts invests in four core competencies: Cloud and Data Engineering (CDE), Analytics and Data Management (ADM), Business Consulting (BC), and Service Management (SM). These enable continued access to global markets and unmet needs. The position is responsible for building out and rolling out solutions, leading advisory engagements, responding to proposals, developing and managing the center of excellence, leading technology roadmaps, in-house capability building, and market research activities.

Location: Chennai, Bangalore preferred

Key Result Areas and Activities:
Advisory and Consulting Services: Drive and execute consulting engagements in Cloud and Data Engineering, Data Integration and Virtualization, Data Science, and Advanced Analytics for clients. Support pre-sales activities including engaging prospects, conducting workshops, developing solutions, writing proposals, and supporting sales and marketing enablement.
Technology Innovation and Evaluation: Proactively track market trends, drive innovations, and conduct technology evaluations. Engage teams and customers to identify top challenges and lead key engagements to co-create and deliver solutions.
Offering Development and Implementation: Support the development of new offerings and lead their pilot implementation. Create and enhance implementation methodologies, processes, and templates for prioritized solutions.
Performance Optimization and Agile Practices: Enhance value flow through value streams, DevOps practices, and the Continuous Delivery Pipeline.
Utilize Agile methodologies and tools like JIRA for project and resource management.
Strategic Leadership and Risk Management: Provide thought leadership in the market and support GTM partnerships with channel partners. Lead prioritized initiatives and manage risks and dependencies effectively.

Essential Skills:
Translate business priorities into actionable plans; manage growth and delivery, including capacity planning, revenue, margins, and utilization. The role requires account mining, customer relationship building, and networking.
Establish long-term relationships with customer stakeholders, understand their strategic priorities, and collaborate on sales pitches, demos, and proposals for account expansion.
Ensure effective client onboarding, maintain high performance and governance standards, lead program kick-offs and milestone reviews, and adhere to delivery excellence through proactive reviews and risk mitigation.
Oversee staffing and team allocation; promote rotation, growth, mentoring, and retention of team members.
Guide and manage teams, set and monitor performance targets, address issues, foster a culture of performance, ensure personal growth, and develop team capabilities.
Possess deep knowledge of Data, Analytics, and AI.

Desirable Skills:
Experience delivering data engineering projects
Experience with project management tools such as Microsoft SharePoint and spreadsheets
Experience with Scrum and Agile
Excellent communication

Qualifications:
Bachelor's degree in business, information technology, or a related field
10+ years of experience in project management, stakeholder management, and pre-sales

Qualities:
Strong people-leadership skills and the ability to engage, drive, and mentor large teams and senior team members
Experience in consultative partnering with customer stakeholders at the executive level to drive business growth
Able to communicate persuasively through written, verbal, and client presentations.
Effective and self-organized in cross-functional teams and across different time zones.
Pune
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Position: Data Engineer - Azure Synapse

Purpose of the Position: As an Azure Synapse Developer, you will be responsible for designing, implementing, and managing scalable data pipelines.

InfoCepts is a global leader in end-to-end data and analytics solutions, with nearly 20 years' experience enabling customers to derive value from a variety of data-driven capabilities. Unique among its peers, InfoCepts operates with the scale of a global consulting firm yet the expertise of a niche partner. At InfoCepts, you'll be challenged to think innovatively while growing your personal and professional skills with the future in mind. We have more than 1,200 global professionals working on cutting-edge technology solutions with a single mission: transforming our customers' journey with data-driven modernization. InfoCepts was recognized as a Gartner Peer Insights Customers' Choice for two consecutive years, in 2020 and 2021, recognizing our best-in-class services that help customers approach any data analytics problem. We were certified as a Great Place to Work in India in 2021, recognizing our high-trust, high-performance work culture and great employee experience. Our award-winning reusable-solutions approach is well recognized in the D&A industry and lets our associates leverage our collective and proven consultative expertise, accelerate solution delivery through automation, and enable faster time to value with reusable toolkits, all while delivering an exceptional customer experience. Our success has been unique, and we are looking for professionals who are enthusiastic and passionate about data & analytics, delivering differentiated experiences, and solving real-world problems for our global customers.
This position works with the competency and delivery teams to execute the assigned responsibilities.

Location: Bangalore/Pune/Nagpur/Chennai
Type of Employment: Full Time Regular

Key Result Areas and Activities:
Design, develop, and deploy ETL/ELT solutions on-premise or in the cloud
Transform data with stored procedures
Develop reports (MicroStrategy/Power BI)
Create and maintain comprehensive documentation for data pipelines, configurations, and processes
Ensure data quality and integrity through effective data management practices
Monitor and optimize data pipeline performance
Troubleshoot and resolve data-related issues

Technical Experience:
Must Have:
Good experience in Azure Synapse
Good experience in ADF
Good experience in Snowflake and stored procedures
Experience with ETL/ELT processes, data warehousing, and data modelling
Experience with data quality frameworks, monitoring tools, and job scheduling
Knowledge of data formats like JSON, XML, CSV, and Parquet
Fluent English (strong written, verbal, and presentation skills)
Agile methodology and tools like JIRA
Good communication and formal skills

Good To Have:
Good experience in MicroStrategy and Power BI
Experience in scripting languages such as Python, Java, or Shell scripting
Familiarity with Azure cloud platforms and cloud data services, and 4+ years of experience in Azure Synapse

Qualities:
Experience with or knowledge of Agile software development methodologies
Can influence and implement change; demonstrates confidence, strength of conviction, and sound decisions
Deals with problems head-on; approaches them in a logical and systematic manner; is persistent and patient; can tackle problems independently; is not over-critical of the factors that led to a problem and is practical about it; follows up with developers on related issues
Able to consult, write, and present persuasively
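The ETL/ELT and data-format requirements above boil down to transformation steps like the following minimal, self-contained sketch (the field names and sample records are illustrative only); in a real pipeline the same logic would typically run inside an ADF/Synapse activity or a notebook:

```python
import csv
import io
import json

# Minimal ETL-style step: parse JSON records, apply small cleanup
# rules, and emit CSV. Sample data and fields are illustrative.

raw = '''[
 {"id": 1, "city": " Pune ", "amount": "120.50"},
 {"id": 2, "city": "Nagpur", "amount": "80"}
]'''

records = json.loads(raw)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "city", "amount"])
writer.writeheader()
for rec in records:
    rec["city"] = rec["city"].strip()              # normalize whitespace
    rec["amount"] = f"{float(rec['amount']):.2f}"  # normalize numeric format
    writer.writerow(rec)

csv_text = buf.getvalue()
```

In Synapse the normalization rules would more often live in a stored procedure or a mapping data flow, but the shape of the work (parse, clean, re-serialize) is the same.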
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Position: Database Admin - Redshift

Purpose of the Position: You will be a critical member of the Infocepts Cloud Data Administrator Team. This position requires a deep understanding of Amazon Redshift, database performance tuning, and optimization techniques. A strong foundation in database concepts and SQL, and experience with AWS services, are essential.

Location: Nagpur/Pune/Bangalore/Chennai
Type of Employment: Full-time

Key Result Areas and Activities:
Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
Backup and Recovery: Develop and maintain database backup and recovery strategies.
Security Enforcement: Implement and enforce database security policies and procedures.
Cost-Performance Balance: Ensure an optimal balance between cost and performance.
Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries. Perform database migrations, upgrades, and patching.
Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams. Automate routine database tasks using scripting languages and tools.
Continuous Learning: Stay updated with the latest Redshift features, best practices, and industry trends. Deliver technology-focused training sessions and conduct expert knowledge sharing with client stakeholders as needed.
Documentation and Proposals: Assist in designing case study documents and collaborate with Centre of Excellence/Practice teams on proposals.
Mentorship and Recruitment: Mentor and groom junior DBAs and participate in conducting interviews for the organization.
Value-Added Improvements: Propose improvements to the existing database landscape.
Product Team Collaboration: Collaborate effectively with product teams to ensure seamless integration and performance.
Essential Skills:
Strong understanding of database design, performance tuning, and optimization techniques
Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
Experience with database backup and recovery, security, and high-availability solutions
Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
Operating system: any flavor of Linux, or Windows

Desirable Skills:
Knowledge of other database systems (e.g., Snowflake, SingleStore, PostgreSQL, MySQL)
AWS Certified Database - Specialty or other relevant certifications
Prior experience working in a large media company would be an added advantage

Qualifications:
Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
7-10 years of experience as a Database Administrator, with at least 5 years specifically with Amazon Redshift
Demonstrated continued learning through one or more technical certifications or related methods
Experience with data warehousing concepts and ETL processes

Qualities:
A quick self-learner, ready to adapt to new technologies as required
Able to deep-dive and research in various technical fields
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback