
1016 ETL Process Jobs - Page 37

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

4.0 - 9.0 years

13 - 18 Lacs

Noida

Work from Office

Role: Partner with stakeholders to understand data requirements, and produce and manage the delivery of activity and value analytics to external stakeholders and clients. Team members typically use business intelligence, data visualization, query, analytic, and statistical software to build solutions, perform analysis, and interpret data.

Primary Job Responsibilities:
- Manage and manipulate mostly structured data.
- Deliver business intelligence solutions using an Agile software development process.
- Conduct analysis, perform normalization operations, and assure data quality.
- Utilize Python and its libraries (scikit-learn, Pandas, NumPy, TensorFlow, PyTorch) for data processing and model development.
- Design, build, and optimize AI models and generative AI applications.
- Create and fine-tune prompts for generative AI systems, including zero-shot, few-shot, and chain-of-thought approaches.
- Implement and manage RAG modeling using vector databases (Milvus, Chroma, FAISS).
- Create specifications to bring data into a common structure.
- Create product specifications and reporting models.
- Perform data analysis and interpret data/business insights.
- Develop and maintain data preparation and validation routines to support data mining.
- Analytical: synthesize complex or diverse information, collect and research data, use intuition and experience to complement data, and design workflows and procedures.
- Develop actionable insights and present recommendations for use across internal/external stakeholders and leadership.
- Establish, refine, and integrate development and test environment tools and software as needed.
- Identify production and non-production application issues.
- Quickly analyze existing code, identify performance/reliability/scalability issues, propose solutions, and optimize or update code as required.
- Provide technical support and consultation for database and infrastructure questions.

Skills Required:
- Extensive exposure to data analysis and business insights.
- Experience in database design, programming, testing, and implementation using SQL with standard practices and procedures.
- Experience writing T-SQL statements, stored procedures, and views using industry best practices for security and efficiency.
- On-the-job experience with relational database management systems such as Microsoft SQL Server and Oracle.
- Extensive hands-on experience in ETL methodologies.
- Experience building business intelligence solutions, preferably using Snowflake.
- 1-2 years of experience in AI and machine learning.
- Understanding of LLMs, Transformers, and AI frameworks and libraries such as TensorFlow, PyTorch, or similar.
- Good to have: experience building data pipelines using Azure data integration tools.
- Experience with and understanding of database performance tuning, database management, requirements analysis, and software development fundamentals.
- Ability to conduct analysis to ensure data quality at various stages of its evolution.
- Proficiency in end-to-end analytics.
- Excellent problem-solving, documentation, and written and verbal communication skills.
- Excellent data maintenance and database security skills; promotes process improvement.
- Drive for excellence with a strong sense of urgency; ability to prioritize effectively.
- Ability to get to the root cause of issues and to solve problems creatively.
- Ability to build relationships with peers, team members, and business partners.
- Ability to guide the planning and mitigation of critical tasks and timelines and align them to the overall strategy.
- Able to work independently with minimal support/mentoring; willing to learn new technology.
- Open to both development and support work; able to write user and technical documentation.
- Hands-on experience working with global customers to gather requirements and translate them into solutions.
- Ability to perform proof-of-concept analyses.
- Experience investigating and responding to routine or standard requests.
- Experience solving complex problems and developing innovative solutions.

General: This is a high-visibility role requiring high energy due to constant interaction with customers. Good to have a candidate with the following abilities:
- Self-driven; able to work independently and facilitate change.
- Able to work in a fast-paced, technical, cross-functional environment.
- Able to work on projects from inception to completion.
- Excellent critical thinking and analytical skills.
- Comfortable working in a high-paced, high-production area.
- Desire and ability to learn new skills, systems, and processes.
- Anticipates customer needs and proactively develops solutions to meet them.

Qualifications:
- Bachelor's degree in MIS, Statistics, Mathematics, Computer Science, Business, or a related field.
- Minimum 3 years of hands-on experience in the DW/BI domain and usage of BI tools.
- Minimum 3 years of hands-on experience in database management.
- Preferred: experience in the health care industry.
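The posting above calls for RAG modeling with vector databases such as FAISS. As a hedged illustration of that pattern (not the employer's implementation), here is a minimal retrieval sketch in Python, assuming the faiss and sentence-transformers packages; the model name and document list are illustrative assumptions.

```python
# Minimal RAG retrieval sketch: embed documents, index them in FAISS,
# and fetch the top-k passages for a query. The embedding model and
# documents are illustrative assumptions, not taken from the posting.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Claims data is refreshed nightly from the warehouse.",
    "Member eligibility records are validated before loading.",
    "Provider directories are deduplicated during ETL.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
embeddings = model.encode(documents, normalize_embeddings=True)

# Inner product on normalized vectors is equivalent to cosine similarity
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype=np.float32))

query = model.encode(["How often is claims data refreshed?"],
                     normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype=np.float32), k=2)
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[i]}")  # retrieved context for the LLM prompt
```

The retrieved passages would then be packed into the generative model's prompt, which is where the zero-shot or few-shot prompt design mentioned in the posting comes in.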

Posted 2 months ago

Apply

7.0 - 11.0 years

18 - 22 Lacs

Noida

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

The Senior CMDB Architect plays a critical senior role in overseeing the strategic design and maintenance of the ServiceNow Configuration Management Database (CMDB). This position is responsible for architecting robust ETL processes and comprehensive data analysis frameworks, and for ensuring the overall integrity and effectiveness of the CMDB within large-scale enterprise environments.

Primary Responsibilities:
- Lead the strategic design and continuous enhancement of the ServiceNow CMDB architecture.
- Design, implement, and oversee complex ETL processes tailored for optimal data integration and system efficiency.
- Conduct high-level audits and develop advanced data validation techniques to ensure the accuracy and reliability of CMDB data.
- Spearhead collaboration with senior IT leadership and cross-functional teams to align the CMDB with business objectives and IT infrastructure changes.
- Provide expert guidance and mentorship to CMDB analysts and other IT staff on best practices, advanced troubleshooting, and complex issue resolution.
- Drive innovation in CMDB processes through the integration of cutting-edge technologies and methodologies.
- Develop and enforce governance policies to maintain data integrity and compliance across the CMDB lifecycle.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- 10+ years of experience in a similar role, with at least 5 years in a leadership or architectural capacity.
- Extensive experience in ServiceNow CMDB architecture and administration, with a proven track record of managing a CMDB in a large-scale, distributed environment.
- Deep expertise in SQL, PL/SQL, and ETL process design, with a solid emphasis on performance optimization and scalability.
- Advanced knowledge of modern, open-source technologies and their deployment in enterprise settings.
- Masterful analytical skills and the ability to synthesize complex data into actionable insights.
- Demonstrated leadership in developing, integrating, and deploying sophisticated database solutions.
- Exceptional problem-solving abilities and the capacity to work on multiple projects and issues simultaneously.
- Solid communication and interpersonal skills, with experience influencing C-suite executives and fostering a collaborative team environment.

Preferred Qualification:
- Relevant certifications in ServiceNow, database management, or a related field.

This senior role demands a visionary leader who can maintain technological excellence and drive the strategic goals of our CMDB initiatives while ensuring alignment with the broader IT and business strategies.

Posted 2 months ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Gurugram

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
As a Senior Data Engineering Analyst, you will be instrumental in driving our data initiatives and enhancing our data infrastructure to support strategic decision-making and business operations. You will lead the design, development, and optimization of complex data pipelines and architectures, ensuring the efficient collection, storage, and processing of large volumes of data from diverse sources. Leveraging your advanced expertise in data modeling and database management, you will ensure that our data systems are scalable, reliable, and optimized for high performance.

A core aspect of your role will involve developing and maintaining robust ETL (Extract, Transform, Load) processes to facilitate seamless data integration and transformation, thereby supporting our analytics and reporting efforts. You will implement best practices in data warehousing and data lake management, organizing and structuring data to enable easy access and analysis for stakeholders across the organization. Ensuring data quality and integrity will be paramount; you will establish and enforce rigorous data validation and cleansing procedures to maintain high standards of accuracy and consistency within our data repositories.

In collaboration with cross-functional teams, including data scientists, business analysts, and IT professionals, you will gather and understand their data requirements, delivering tailored technical solutions that align with business objectives. Your ability to communicate complex technical concepts to non-technical stakeholders will be essential in fostering collaboration and ensuring alignment across departments. Additionally, you will mentor and guide junior data engineers and analysts, promoting a culture of continuous learning and professional growth within the data engineering team.

You will take a proactive role in performance tuning and optimization of our data systems, identifying and resolving bottlenecks to enhance efficiency and reduce latency. Staying abreast of the latest advancements in data engineering technologies and methodologies, you will recommend and implement innovative solutions that drive our data capabilities forward. Your strategic input will be invaluable in planning and executing data migration and integration projects, ensuring seamless transitions between systems with minimal disruption to operations.

Maintaining comprehensive documentation of data processes, architectural designs, and technical specifications will be a key responsibility, supporting knowledge sharing and organizational standards. You will generate detailed reports on data quality, system performance, and the effectiveness of data engineering initiatives, providing valuable insights to inform strategic decisions. Additionally, you will oversee data governance protocols, ensuring compliance with relevant data protection regulations and industry standards, thereby safeguarding the integrity and security of our data assets.

Your leadership and expertise will contribute significantly to the enhancement of our data infrastructure, enabling the organization to leverage data-driven insights for sustained growth and competitive advantage. By fostering innovation, ensuring data excellence, and promoting best practices, you will play a critical role in advancing our data engineering capabilities and supporting the overall success of the business.

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 5+ years in data engineering, data analysis, or a similar role with a proven track record.
- Technical skills: advanced proficiency in SQL and experience with relational databases (Oracle, MySQL, SQL Server); expertise in ETL processes and tools; solid understanding of data modeling, data warehousing, and data lake architectures; proficiency in programming languages such as Python or Java; familiarity with cloud platforms (Azure) and their data services; knowledge of data governance principles and data protection regulations (GDPR, HIPAA, CCPA).
- Soft skills: excellent analytical and problem-solving abilities; solid communication and collaboration skills; leadership experience and the ability to mentor junior team members; a proactive mindset with a commitment to continuous learning and improvement.

Preferred Qualifications:
- Relevant certifications.
- Experience with version control systems (Git).

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
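The role above emphasizes rigorous data validation and cleansing routines before data enters the warehouse. As a hedged sketch of that kind of check, here is a minimal pandas example; the column names and rules are illustrative assumptions, not the employer's actual procedures.

```python
# Minimal data-validation sketch: flag rows that violate basic quality rules
# before loading. Column names and thresholds are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "member_id": ["M001", "M002", None, "M004"],
    "claim_amount": [120.50, -30.00, 75.25, 410.00],
    "service_date": ["2024-01-10", "2024-02-29", "2024-13-01", "2024-03-15"],
})

errors = pd.DataFrame(index=df.index)
errors["missing_member"] = df["member_id"].isna()
errors["negative_amount"] = df["claim_amount"] < 0
# coerce invalid dates to NaT so they can be flagged rather than crash the load
errors["bad_date"] = pd.to_datetime(df["service_date"], errors="coerce").isna()

bad_rows = df[errors.any(axis=1)]
clean_rows = df[~errors.any(axis=1)]
print(f"{len(bad_rows)} rows quarantined, {len(clean_rows)} rows passed validation")
```

In a production pipeline the quarantined rows would typically be written to a reject table with the failing rule names, so data quality reports like those described above can be generated from them.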

Posted 2 months ago

Apply

5.0 - 9.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Job Summary
Synechron is seeking an experienced ETL + Functional Lead to join our innovative team. This position is vital for designing, developing, and executing automation scripts, as well as leading functional and regression testing processes. The ETL + Functional Lead will work collaboratively with development and QA teams to ensure the timely delivery of high-quality software solutions, significantly contributing to Synechron's business objectives.

Software Requirements
Required proficiency:
- Selenium, TestNG, Maven, Jenkins, JIRA (advanced level)
- ETL (Extract, Transform, Load) processes and functional testing (advanced level)
- API testing (proficient)
- Database testing (proficient)
Preferred proficiency:
- Knowledge of Agile methodologies, including Scrum

Overall Responsibilities
- Design, develop, and execute automation scripts for ETL and functional testing.
- Perform comprehensive functional and regression testing to ensure software quality.
- Identify, track, and resolve software defects in a timely manner.
- Collaborate with development and QA teams to ensure the timely delivery of high-quality software.
- Contribute to the continuous improvement of testing processes and methodologies.

Technical Skills (by Category)
- Programming languages: Java (strong knowledge) required; familiarity with other languages such as Python or JavaScript preferred.
- Frameworks and libraries: Selenium, TestNG, Maven required.
- Development tools and methodologies: Jenkins, JIRA required; Agile methodologies (Scrum) preferred.

Experience Requirements
- Minimum of 5-10 years of experience in ETL + functional lead roles.
- Proven experience in automation testing, particularly with Selenium and Java.
- Prior experience leading testing initiatives and collaborating with cross-functional teams.

Day-to-Day Activities
- Write, maintain, and execute automated test cases for ETL and functional testing.
- Troubleshoot and debug automation scripts to ensure optimal performance.
- Review test cases and provide constructive feedback to enhance testing processes.
- Participate in agile sprint planning, retrospectives, and other team meetings.
- Stay up-to-date with the latest advancements in testing technologies and methodologies.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Continuous professional development in ETL processes and automation testing technologies is encouraged.

Professional Competencies
- Strong problem-solving and critical thinking capabilities.
- Effective leadership and teamwork abilities.
- Excellent communication and interpersonal skills.
- Strong attention to detail and ability to prioritize tasks effectively.
- Proactive, solution-oriented approach to problem-solving.
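The automation work above centres on Selenium with Java and TestNG; to keep the examples on this page in one language, here is a hedged sketch of the same pattern using Selenium's Python bindings. The URL and element IDs are illustrative assumptions.

```python
# Minimal Selenium smoke-test sketch (Python bindings; the posting itself
# targets Java/TestNG, but the pattern is the same). URL and element IDs
# are illustrative assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a chromedriver is available on PATH
try:
    driver.get("https://example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()
    # functional check: a dashboard heading should be present after login
    assert driver.find_element(By.TAG_NAME, "h1").text != ""
finally:
    driver.quit()  # always release the browser, even on assertion failure
```

In a TestNG-style suite this would be one test method with the driver created in a setup hook; CI tools like Jenkins, named in the posting, then run the suite on every build.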

Posted 2 months ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

We are seeking a highly skilled ETL Architect Powered by AI (Apache NiFi/Kafka) to join our team. The ideal candidate will have expertise in managing, automating, and orchestrating data flows using Apache NiFi. In this role, you will design, implement, and maintain scalable data pipelines that handle real-time and batch data processing. The role also involves integrating NiFi with various data sources, performing data transformation tasks, and ensuring data quality and governance.

Key Responsibilities:
Real-time data integration (Apache NiFi & Kafka):
- Design, develop, and implement real-time data pipelines leveraging Apache NiFi for seamless data flow.
- Build and maintain Kafka producers and consumers for effective streaming data management across systems.
- Ensure the scalability, reliability, and performance of data streaming platforms using NiFi and Kafka.
- Monitor, troubleshoot, and optimize data flow within Apache NiFi and Kafka clusters.
- Manage schema evolution and support data serialization formats such as Avro, JSON, and Protobuf.
- Set up, configure, and optimize Kafka topics, partitions, and brokers for high availability and fault tolerance.
- Implement backpressure handling, prioritization, and flow-control strategies in NiFi data flows.
- Integrate NiFi flows with external services (e.g., REST APIs, HDFS, RDBMS) for efficient data movement.
- Establish and maintain secure data transmission, access controls, and encryption mechanisms in NiFi and Kafka environments.
- Develop and maintain batch ETL pipelines using tools like Informatica, Talend, and custom Python/SQL scripts.
- Continuously optimize and refactor existing ETL workflows to improve performance, scalability, and fault tolerance.
- Implement job scheduling, error handling, and detailed logging mechanisms for data pipelines.
- Conduct data quality assessments and design frameworks to ensure high-quality data integration.
- Design and document both high-level and low-level data architectures for real-time and batch processing.
- Lead technical evaluations of emerging tools and platforms for potential adoption into existing systems.

Qualifications we seek in you:
Minimum qualifications/skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Significant experience in IT with a focus on data architecture and engineering.
- Proven experience in technical leadership, driving data integration projects and initiatives.
- Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer) are a plus.
- Strong analytical skills and the ability to translate business requirements into effective technical solutions.
- Proficiency in communicating complex technical concepts to non-technical stakeholders.

Preferred qualifications/skills:
- Extensive hands-on experience as a Data Architect.
- In-depth experience with Apache NiFi, Apache Kafka, and related ecosystem components (e.g., Kafka Streams, Schema Registry).
- Ability to develop and optimize NiFi processors to handle various data sources and formats.
- Proficiency in creating reusable NiFi templates for common data flows and transformations.
- Familiarity with integrating NiFi and Kafka with big data technologies like Hadoop, Spark, and Databricks.
- At least two end-to-end implementations of data integration solutions in a real-world environment.
- Experience with metadata management frameworks and scalable data ingestion processes.
- Solid understanding of data platform design patterns and best practices for integrating real-time data systems.
- Knowledge of ETL processes, data integration tools, and data modeling techniques.
- Demonstrated experience in Master Data Management (MDM) and data privacy standards.
- Experience with modern data platforms such as Snowflake and Databricks, and with big data tools.
- Proven ability to troubleshoot complex data issues and implement effective solutions.
- Strong project management skills with the ability to lead data initiatives from concept to delivery.
- Familiarity with AI/ML frameworks and their integration with data platforms is a plus.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively across cross-functional teams.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
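The architecture described above revolves around Kafka producers and consumers feeding NiFi flows. As a hedged sketch of that producer/consumer pairing, assuming the kafka-python package and an illustrative broker address and topic name (not from the posting):

```python
# Minimal Kafka produce/consume sketch with kafka-python.
# Broker address, topic name, and payload are illustrative assumptions.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("etl.events", {"source": "nifi", "status": "flowfile_ok"})
producer.flush()  # block until the record is acknowledged by the broker

consumer = KafkaConsumer(
    "etl.events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating once the topic is drained
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for record in consumer:
    print(record.topic, record.value)
```

In a NiFi deployment the same roles are usually played by the PublishKafka and ConsumeKafka processors; the code form is shown here only to make the data flow concrete.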

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 13 Lacs

Pune, Chennai, Bengaluru

Hybrid

Key Responsibilities:
- Lead the delivery and implementation of scalable BI solutions using IBM Cognos.
- Collaborate with cross-functional teams to define project scope, timelines, and resource requirements.
- Design and implement custom, device-responsive layouts for dashboards and reports.
- Integrate dynamic data sources, enabling real-time or near-real-time data updates.
- Utilize advanced charting libraries to create visually engaging and informative data visualizations.
- Apply frontend and backend scripting frameworks to extend dashboard capabilities and interactivity.
- Ensure BI solutions are optimized for performance, maintainability, and usability.
- Provide leadership and mentoring to junior developers and analysts on BI best practices and delivery standards.

Preferred candidate profile:
- 4 to 5 years of hands-on experience with IBM Cognos BI tools.
- Proven experience leading large-scale BI/analytics projects from design through deployment.
- Strong understanding of responsive dashboard design and UI/UX best practices.
- Proficiency with scripting languages and frameworks for both frontend (JavaScript, HTML/CSS) and backend (e.g., Python, SQL) (good to have, not mandatory).
- Expertise in integrating charting and visualization libraries (e.g., D3.js, Highcharts) (good to have, not mandatory).

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE

Role Description:
The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with data architects, business SMEs, and data scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that improve ETL platform performance.
- Participate in sprint planning meetings and provide estimates on technical implementation.

Basic Qualifications and Experience:
- Master's degree with 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree with 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma with 7 to 9 years of Computer Science, IT, or related field experience.

Functional Skills:
Must-have skills:
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing.
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake.
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets.
- Experience with software engineering best practices, including version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps.

Good-to-have skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena).
- Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations.
- Understanding of machine learning pipelines and frameworks for ML/AI models.

Professional Certifications:
- AWS Certified Data Engineer (preferred).
- Databricks certification (preferred).

Soft Skills:
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
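The must-have skills above centre on PySpark ETL pipelines built on Databricks. A minimal sketch of that extract-transform-load shape, assuming a Spark session and illustrative paths and column names (all assumptions, not the employer's schema):

```python
# Minimal PySpark ETL sketch: extract CSV, transform, load as Parquet.
# File paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: raw CSV with a header row
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types, drop invalid rows, derive a partition column
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))
    .withColumn("order_month", F.date_trunc("month", F.to_date("order_date")))
)

# Load: partitioned Parquet, ready for the warehouse or a Delta table
clean.write.mode("overwrite").partitionBy("order_month").parquet("/data/curated/orders")
```

On Databricks the same job would typically write to a Delta table instead of raw Parquet, which adds the ACID and time-travel properties useful for the data-quality requirements listed above.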

Posted 2 months ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office

Role Description:
As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL.
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.

Qualifications:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role.
- Hands-on experience with data models and database performance tuning in PostgreSQL.
- Experience implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
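The tuning duties above lean on PostgreSQL query analysis. As a hedged sketch of inspecting a slow query's plan from Python, assuming psycopg2 and illustrative connection details and table names (all assumptions):

```python
# Minimal PostgreSQL tuning sketch: fetch the execution plan of a query
# with EXPLAIN ANALYZE. Connection string and table name are assumptions.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=etl_user host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT customer_id, SUM(amount)
        FROM payments
        WHERE paid_at >= NOW() - INTERVAL '30 days'
        GROUP BY customer_id
    """)
    for (line,) in cur.fetchall():
        # a Seq Scan on a large table here is a candidate for an index
        print(line)
conn.close()
```

The same EXPLAIN output is what guides the bottleneck-resolution work named in the responsibilities: sequential scans, misestimated row counts, and spilled sorts are the usual signals.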

Posted 2 months ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Pune

Work from Office

We are seeking a highly skilled and experienced MSTR (MicroStrategy) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions.

Responsibilities:
- Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP cubes), documents, and visualizations.
- Utilize MicroStrategy features and functionality such as Freeform SQL, Query Builder, MDX connectivity, and data blending.
- Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability.
- Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security.
- Perform performance tuning and optimization of MicroStrategy reports and dashboards.
- Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support.
- Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform.
- Write complex and efficient SQL queries to extract, transform, and load data from various data sources.
- Understand database schema design and data modeling principles.
- Optimize SQL queries for performance within the MicroStrategy environment.
- Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects.
- Develop and maintain database views and stored procedures to support MicroStrategy development.
- Collaborate with business analysts and end users to understand their reporting and analytical requirements.
- Translate business requirements into technical specifications for MicroStrategy development.
- Participate in the design and prototyping of BI solutions.
- Develop and execute unit tests and integration tests for MicroStrategy objects.
- Participate in user acceptance testing (UAT) and provide support to end users during the testing phase.
- Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards.
- Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions.
- Provide training and support to end users on how to effectively use MicroStrategy reports and dashboards.
- Adhere to MicroStrategy best practices and development standards.
- Stay updated on the latest MicroStrategy features and functionality.
- Proactively identify opportunities to improve existing MicroStrategy solutions and processes.

Required Skills and Expertise:
- Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential), including a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration.
- Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems).
- Experience in data modeling and understanding of dimensional modeling concepts (e.g., star schema, snowflake schema).
- Solid understanding of BI concepts, data warehousing principles, and ETL processes.
- Experience in performance tuning and optimization of MicroStrategy reports and SQL queries.
- Ability to gather and analyze business requirements and translate them into technical specifications.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders.
- Experience with version control systems (e.g., Git).
- Ability to work independently and as part of a team.

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

As an SSAS Developer focused on SSAS and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success.

Responsibilities:
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular).
- Create and manage OLAP cubes to support business intelligence reporting and analytics.
- Implement best practices for data modeling and cube design.
- Optimize the performance of SSAS solutions for efficient query processing and data retrieval.
- Tune SSAS models and cubes to ensure optimal performance; identify and resolve performance bottlenecks.
- Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models.
- Develop and implement ETL (Extract, Transform, Load) processes for data integration.
- Ensure data quality and consistency across integrated data sources.
- Support the development of business intelligence reports and dashboards.
- Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions.
- Provide technical support and troubleshooting for SSAS-related issues.
- Preferably, apply knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing, integrating data from AWS S3 into SSAS models using PolyBase or other appropriate methods.

Required Skills & Qualifications:
Experience:
- 5-8 years of experience as a SQL developer with a focus on SSAS and OLAP.
- Proven experience designing and developing multidimensional and tabular SSAS models.
Technical skills:
- Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development.
- Proficiency in writing MDX and DAX queries.
- Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Experience with SQL Server databases and related tools.
- Knowledge of AWS S3 and SQL Server PolyBase preferred.

Posted 2 months ago

Apply

3.0 - 7.0 years

17 - 20 Lacs

Bengaluru

Work from Office

We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge from attending educational workshops and reviewing publications.
- 5 to 8 years of hands-on experience in Siemens Teamcenter PLM implementation, customization, and deployment.
- Sound knowledge of PLM processes such as BOM Management, Change Management, Document Management, and Configuration Management.

Preferred technical and professional experience:
- Strong problem-solving skills and the ability to work independently as well as part of a team.
- Good understanding of SDLC practices, with experience using tools like JIRA, Git, Jenkins, and Confluence.
- Ability to interact effectively with cross-functional teams and translate business requirements into PLM solutions.

Posted 2 months ago

Apply

3.0 - 8.0 years

3 - 8 Lacs

Jaipur, Delhi / NCR, Mumbai (All Areas)

Work from Office

Collect, analyze, and visualize data to uncover insights and guide business decisions. Use SQL and analytical tools to identify trends, create reports, and collaborate with teams to drive data-informed strategies.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 12 Lacs

Bengaluru

Work from Office

Primary (must-have) skills:
- Strong experience with the ETL tool IBM InfoSphere DataStage for developing data pipelines and data warehousing.
- Strong hands-on experience with Databricks.
- Strong hands-on experience with SQL and relational databases.
- Proactive, with strong communication and interpersonal skills to collaborate effectively with team members and stakeholders.
- Strong understanding of data processing (ETL) concepts.
- Prepared to sometimes step outside the developer role to gather and create their own analysis and requirements.

Secondary skills:
- Experience writing T-SQL stored procedures.
- Moderate experience with AWS Cloud services.
- Ability to write sufficient and comprehensive documentation of data processing flows.

Posted 2 months ago

Apply

8.0 - 11.0 years

18 - 30 Lacs

Pune

Hybrid

So, what's the role all about?
A Specialist Software Engineer in Java Backend holds a prominent position in software development teams, responsible for designing, developing, and implementing complex software solutions that leverage backend technologies. Here's an overview of the key responsibilities and qualifications for this role.

How will you make an impact?
- Participate in our product development from ideation to deployment and beyond.
- Maintain quality, ensure responsiveness, and help optimize new and existing systems.
- Collaborate with the rest of the engineering team to design and build new features on time and to budget.
- Maintain code integrity and organization.
- Understand and implement security and data protection.
- Understand the business change cycle from inception to implementation, including the organization of change initiatives.
- Bring deep experience with the Java programming language and related frameworks such as Spring and Hibernate.
- Develop front ends, with proficient experience using Angular 10+.
- Coordinate build and release activities with key stakeholders.

Have you got what it takes?
- Bachelor's/Master's of Engineering degree in Computer Science, Electronic Engineering, or equivalent from a reputed institute.
- 8-11 years of software development experience.
- At least 8 years of working experience in Core Java; proficient with Java algorithms and data structures.
- Worked on high-performance, highly available, and scalable systems.
- Strong experience with J2EE, the Spring Framework, IoC, and annotations.
- Experience with an object-relational mapper (e.g., Hibernate).
- Strong knowledge of OOAD and design patterns.
- Development experience building solutions that leverage SQL and NoSQL databases.
- Strong development experience creating RESTful web APIs.
- Experience designing and developing scalable multi-tenant SaaS-based solutions.
- Experience with public cloud infrastructure and technologies such as AWS, Azure, or GCP.

You will have an advantage if you also have:
- Knowledge of big data and ETL concepts (or a BI tool like Tableau).
- Development experience in JavaScript, Angular/ReactJS, Java Server Pages, or Java Server Faces.
- Experience working in an Agile development environment and using work-item management tools like JIRA.
- Ability to work independently and collaboratively, with good communication skills.
- A culture of innovation, the ability to work under high pressure, and high attention to detail and accuracy.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 6876
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 2 months ago

Apply

4.0 - 7.0 years

9 - 15 Lacs

Pune

Hybrid

About Client: Hiring for one of the topmost MNCs!

Job Title: Power BI Developer
Qualification: Any Graduate or above
Relevant Experience: 4 to 7 years
Must-Have Skills: Power BI development, SQL, ETL/ELT processes
Location: Pune
CTC Range: 9-18 Lakhs per annum
Notice Period: Immediate/15 days
Shift Timing: NA
Mode of Interview: Virtual
Mode of Work: Hybrid

Deepika V
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
deepika.venkatesh@blackwhite.in | www.blackwhite.in | 08067432434

Posted 2 months ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Job_Description":" Data Modeler JD: Proven experience as a Data Modeler or in a similar role (8 years depending on seniority level). Proficiency in data modeling tools (e.g., ER/Studio, Erwin, SAP PowerDesigner, or similar). Strong understanding of database technologies (e.g., SQL Server, Oracle, PostgreSQL, Snowflake). Experience with cloud data platforms (e.g., AWS, Azure, GCP). Familiarity with ETL processes and tools. Excellent knowledge of normalization and denormalization techniques. Strong analytical and problem-solving skills.

Posted 2 months ago

Apply

15.0 - 20.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Title: Data Analytics Lead
Experience: 15-20 years
Location: Bengaluru

Key Responsibilities:
Team Leadership & Delivery Excellence:
- Lead a cross-functional team comprising data architects, analysts, business SMEs, and technologists to deliver high-impact data analytics solutions.
- Define and enforce best practices for efficient, scalable, and high-quality delivery.
- Inspire a culture of collaboration, accountability, and continuous improvement within the team.

Strategic Data Leadership:
- Develop and execute a data strategy aligned with client business objectives, ensuring seamless integration of analytics into decision-making processes.
- Collaborate with stakeholders to translate business needs into actionable data solutions, influencing strategic decisions.

Technical and Architectural Expertise:
- Architect and oversee data platforms, including SQL Server, Snowflake, and Power BI, ensuring optimal performance, scalability, and governance.
- Lead initiatives in data architecture, data modeling, and data warehouse (DWH) development tailored to alternative investment strategies.
- Evaluate emerging technologies, such as big data and advanced analytics tools, and recommend their integration into client solutions.
- Champion data quality, integrity, and security, aligning with compliance standards in private equity and alternative investments.

Performance & Metrics:
- Define and monitor KPIs to measure team performance and project success, ensuring timely delivery and measurable impact.
- Collaborate with stakeholders to refine reporting, dashboarding, and visualization for decision support.

Governance & Compliance:
- Establish robust data governance frameworks in partnership with client stakeholders.
- Ensure adherence to regulatory requirements impacting private markets investments, including fund accounting and compliance.

What's on offer:
- Competitive and above-market salary.
- Flexible hybrid work schedule with tools for both office and remote productivity.
- Hands-on exposure to cutting-edge technology and global financial markets.
- Opportunity to collaborate directly with international teams in New York and London.

Candidate Profile:
Experience:
- 15+ years of progressive experience in program or project management within the capital markets and financial services sectors.
- Demonstrated expertise in SQL Server, Snowflake, Power BI, ETL processes, and Azure cloud data platforms.
- Hands-on experience with big data technologies and modern data architecture.
- Proven track record in delivering projects emphasizing data quality, integrity, and accuracy.
- Deep understanding of private markets, including private equity, private credit, CLOs, compliance, and regulations governing alternative investments.

Leadership & Collaboration:
- Exceptional problem-solving skills and decision-making abilities in high-pressure, dynamic environments.
- Experience leading multi-disciplinary teams to deliver large-scale data initiatives.
- Strong client engagement and communication skills, fostering alignment and trust with stakeholders.

Preferred Certifications:
- Relevant certifications (e.g., CFA, Snowflake Certified Architect, or Microsoft Power BI Certified).

Education:
- Bachelor's degree in Computer Science, IT, Finance, Economics, or a related discipline; advanced degrees (MBA, MS in Computer Science, or related fields) preferred.

Interview Process:
- Initial recruiter call.
- Interview with the technical team and the delivery and account leadership team at ThoughtFocus.
- Interview with the client stakeholders.
- Final HR discussion.

Posted 2 months ago

Apply

6.0 - 8.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Title: ETL Developer (SnapLogic)
Experience: 6-8 years
Location: Bangalore

Technical Skills:
- Design, develop, and maintain SnapLogic pipelines to support integration projects.
- Build and manage APIs using SnapLogic to connect various data sources and systems.
- Leverage SnapLogic agent functionality to enable secure and efficient data integration.
- Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs.
- Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability.
- Document integration processes and provide guidance to team members on best practices.
- Proven experience with SnapLogic, including API builds and agent functionality.
- Strong understanding of integration patterns and best practices.
- Proficiency in data integration and ETL processes.
- Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB).
- Knowledge of data warehousing concepts and data modelling.
- Experience performing validations on large-scale data.
- Strong REST API, JSON, and data transformation experience.
- Experience with unit testing and integration testing.
- Familiarity with large language models (LLMs) and their integration with data pipelines.
- Experience in database architecture and optimization.
- Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA).

Behavioral Skills:
- Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative.
- Follow engineering best practices and principles within your organisation.
- Work closely with a Lead Software Engineer.
- Be an active member of the MMC Technology community: contribute, collaborate, and learn.
- Build strong relationships with members of your engineering squad.
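The integration work above is built around REST APIs and JSON transformations. SnapLogic itself is configured visually, so as a hedged sketch of only the underlying call pattern, here is a plain Python example using the requests package; the endpoint URLs and field names are hypothetical.

```python
# Minimal REST-to-JSON transformation sketch. Endpoint URLs and field
# names are illustrative assumptions, not a real SnapLogic API.
import requests

resp = requests.get("https://api.example.com/v1/claims", timeout=10)
resp.raise_for_status()
claims = resp.json()

# Transform: keep a subset of fields and normalize the status value
transformed = [
    {
        "claim_id": c["id"],
        "amount": float(c["amount"]),
        "status": c["status"].strip().upper(),
    }
    for c in claims
]

# Load: post the reshaped records to a downstream service
out = requests.post("https://api.example.com/v1/warehouse/claims",
                    json=transformed, timeout=10)
out.raise_for_status()
print(f"Forwarded {len(transformed)} records, HTTP {out.status_code}")
```

In SnapLogic the same flow would be a REST Get snap, a Mapper snap for the field reshaping, and a REST Post snap; the code form simply makes the transformation explicit.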

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years of experience in IICS and Snowflake.
- Proven experience implementing ETL solutions with a focus on IICS and Snowflake.
- Strong hands-on development experience in IICS, including CDI, CAI, and Mass Ingestion.
- Proficiency in using various connectors for different source/file formats and databases.
- Knowledge of web services and their integration into ETL processes.
- Administration skills related to IICS, ensuring a smooth operational environment.
- Proven experience as a Snowflake developer with a strong focus on data warehousing.
- Hands-on experience designing and implementing data models within the Snowflake environment.
- Proficient in developing ETL processes using Snowflake features and SQL.
- Knowledge of security best practices and access control within Snowflake.
- Familiarity with data integration and data warehouse concepts.
- Experience with data migration to Snowflake from other platforms is a plus.
- Excellent problem-solving skills with the ability to analyze and resolve critical issues.
- Strong organizational and project management skills.
- Effective communication skills for customer interactions and status updates.
- Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success.
- Eager to contribute to a team-oriented environment.
- Strong prioritization and multi-tasking skills with a track record of meeting deadlines.
- Ability to be creative and analytical in a problem-solving environment.
- Effective verbal and written communication skills.
- Adaptable to new environments, people, technologies, and processes.
- Ability to manage ambiguity and solve undefined problems.
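The role above centres on building ETL against Snowflake with SQL. As a hedged sketch of a typical load step using the Snowflake Python connector, with illustrative account, credentials, stage, and table names (all assumptions):

```python
# Minimal Snowflake load sketch: stage a local file and COPY it into a
# table. Account, credentials, file path, and table name are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # upload the file to the table's internal stage
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
    # bulk-load it; abort the statement on any bad row
    cur.execute("""
        COPY INTO ORDERS
        FROM @%ORDERS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results returned by COPY
finally:
    cur.close()
    conn.close()
```

In an IICS-driven setup the same PUT/COPY mechanics run behind the Snowflake connector; the code form shows what the mapping task executes.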

Posted 2 months ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Role: SQL Developer
Location: Bangalore
Experience: 4+ years
Employment Type: Full time, regular working mode
Notice Period: Immediate to 15 days

Job Summary:
As a Software Engineer / Senior Software Engineer (Database), you will play a pivotal role in designing, developing, and maintaining the database infrastructure for our core product. You will collaborate closely with the development team to ensure that our database solutions are scalable, efficient, and aligned with our business objectives.

Key Responsibilities:
1. Database design and development:
- Develop and implement database models, views, tables, stored procedures, and functions to support product development.
- Design and maintain SSIS packages, T-SQL scripts, and SQL jobs.
- Optimize database performance through query tuning, indexing, and partitioning.
2. Data integration:
- Develop complex stored procedures for loading data into staging tables from various sources.
- Ensure data integrity and consistency across different systems.
3. Data analytics:
- Collaborate with data analysts to design and implement data analytics solutions using tools like SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map.
4. Documentation:
- Document complex processes, business requirements, and specifications.
5. Database administration:
- Provide authentication and authorization for database access.
- Develop and enforce best practices for database design and development.
- Manage database migration activities.

Required Skills:
Technical skills:
- Strong proficiency in MS SQL Server (query tuning, stored procedures, functions, views, triggers, indexes, columnstore indexes, SQL Server column storage, query execution plans).
- Experience with database design, normalization, and performance optimization.
- Knowledge of data warehousing and ETL processes.
- Experience with SSIS, SSRS, and Excel Power Pivot/View/Map.
Soft skills:
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and as part of a team.
- Attention to detail and commitment to quality.

Benefits:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- Remote work flexibility.
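The responsibilities above include complex stored procedures for loading staging tables in MS SQL Server. As a hedged sketch of invoking such a procedure from Python, assuming pyodbc and a hypothetical procedure name and parameter (none of which come from the posting):

```python
# Minimal SQL Server sketch: call a staging-load stored procedure via
# pyodbc. DSN, procedure name, and parameter are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=ProductDW;Trusted_Connection=yes;"
)
cur = conn.cursor()

# execute the hypothetical staging load for one source system
cur.execute("{CALL dbo.usp_LoadStagingOrders (?)}", "SOURCE_CRM")
conn.commit()

# quick sanity check on the freshly loaded staging table
cur.execute("SELECT COUNT(*) FROM staging.Orders WHERE load_batch = ?",
            "SOURCE_CRM")
print("rows staged:", cur.fetchone()[0])
conn.close()
```

In the SSIS-centric workflow the posting describes, the same procedure would usually be wired into an Execute SQL Task inside a package rather than called from Python.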

Posted 2 months ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Position: Python Lead
Total Experience Required: 6+ years (around 5 years relevant)
Mandatory Skills: Strong Python coding and development
Good-to-Have Skills: Cloud, SQL, data analysis
Location: Pune (Kharadi), work from office 3 days/week

About The Role:
We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark, SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the pharma value chain is a plus.

Good-to-Have Skills:
- Experience with modern data solutions on Azure.
- Knowledge of the principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications:
Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
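Among the responsibilities above is developing RESTful applications with Flask, Django, or FastAPI. A minimal FastAPI sketch follows; the route, model, and field names are illustrative assumptions, not a specified API.

```python
# Minimal FastAPI sketch: one typed REST endpoint with request validation.
# Route and fields are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="etl-status-service")

class PipelineRun(BaseModel):
    pipeline: str
    records_processed: int
    succeeded: bool

runs: list[PipelineRun] = []  # in-memory store; a real service would use a DB

@app.post("/runs")
def record_run(run: PipelineRun) -> dict:
    # pydantic has already validated types before this function is called
    runs.append(run)
    return {"stored": len(runs)}

@app.get("/runs/latest")
def latest_run():
    return runs[-1] if runs else {"detail": "no runs recorded"}

# Run locally with: uvicorn main:app --reload
```

FastAPI's pydantic models give the request validation for free, which is one reason it is often preferred over plain Flask for new data-service APIs.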

Posted 2 months ago


4.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About The Role :
- Minimum 4 years of experience in a relevant field.
- Hands-on experience with Databricks, SQL, Azure Data Factory, and Azure DevOps.
- Strong expertise in Microsoft Azure cloud platform services (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Synapse Analytics).
- Proficient in building CI/CD pipelines in Azure DevOps for automated deployments.
- Strong grasp of performance optimization techniques such as temp tables, CTEs, indexing, merge statements, and joins (see the sketch below).
- Familiarity with advanced SQL and programming skills (e.g., Python, PySpark).
- Familiarity with data warehousing and data modeling concepts.
- Proficient in data management and deployment processes using Azure Data Factory, Databricks, and Azure DevOps.
- Knowledge of integrating Azure services with DevOps.
- Experience in designing and implementing scalable data architectures.
- Proficient in ETL processes and tools.
- Strong communication and collaboration skills.
- Certifications in relevant Azure technologies are a plus.

Location : Bangalore / Hyderabad
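For the "merge statements" item above, a common Databricks counterpart is a Delta Lake upsert. Below is a hedged sketch of that pattern; it assumes the delta-spark package is available (as it is on Databricks clusters), and the table paths and join key are placeholders.

```python
# Hedged sketch of a Delta Lake MERGE (upsert) on Databricks.
# Paths and the customer_id join key are invented for illustration.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("upsert-demo").getOrCreate()

# Incoming batch of changed rows from a staging area (placeholder path).
updates = spark.read.parquet("/mnt/staging/customers")

target = DeltaTable.forPath(spark, "/mnt/curated/customers")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # refresh rows that already exist
    .whenNotMatchedInsertAll()  # insert brand-new rows
    .execute()
)
```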

Posted 2 months ago


5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office

Role Description :
As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities :
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL (see the sketch below).
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.

Qualifications :
Experience :
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role.
- Hands-on experience with data models and database performance tuning in PostgreSQL.
- Experience implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
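To make the PostgreSQL tuning responsibility concrete, here is a minimal sketch of a plan-driven tuning loop using psycopg2: capture the actual execution plan with EXPLAIN ANALYZE, add an index, and compare. The schema, table, column, and connection details are hypothetical.

```python
# Hedged sketch of a PostgreSQL tuning loop; all names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=etl user=etl_user host=localhost")
conn.autocommit = True  # so CREATE INDEX takes effect immediately
cur = conn.cursor()

def show_plan(sql):
    # EXPLAIN ANALYZE runs the query and reports the real plan and timings.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + sql)
    for (line,) in cur.fetchall():
        print(line)

query = "SELECT * FROM staging.orders WHERE customer_id = 42"

show_plan(query)  # typically a Seq Scan before any index exists
cur.execute(
    "CREATE INDEX IF NOT EXISTS idx_orders_customer "
    "ON staging.orders (customer_id)"
)
show_plan(query)  # expect an Index Scan once the planner can use the index
conn.close()
```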

Posted 2 months ago


5.0 - 7.0 years

6 - 11 Lacs

Pune

Work from Office

About The Role :
We are seeking a highly skilled and experienced MicroStrategy (MSTR) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions.

Responsibilities :
- Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP cubes), documents, and visualizations.
- Utilize various MicroStrategy features and functionalities such as Freeform SQL, Query Builder, MDX connectivity, and data blending.
- Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability.
- Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security.
- Perform performance tuning and optimization of MicroStrategy reports and dashboards.
- Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support.
- Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform.
- Write complex and efficient SQL queries to extract, transform, and load data from various data sources.
- Understand database schema design and data modeling principles.
- Optimize SQL queries for performance within the MicroStrategy environment.
- Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects.
- Develop and maintain database views and stored procedures to support MicroStrategy development.
- Collaborate with business analysts and end-users to understand their reporting and analytical requirements.
- Translate business requirements into technical specifications for MicroStrategy development.
- Participate in the design and prototyping of BI solutions.
- Develop and execute unit tests and integration tests for MicroStrategy objects.
- Participate in user acceptance testing (UAT) and provide support to end-users during the testing phase.
- Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards (see the sketch below for one way to spot-check report output programmatically).
- Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions.
- Provide training and support to end-users on how to effectively use MicroStrategy reports and dashboards.
- Adhere to MicroStrategy best practices and development standards.
- Stay updated with the latest MicroStrategy features and functionalities.
- Proactively identify opportunities to improve existing MicroStrategy solutions and processes.

Required Skills and Expertise :
- Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential), including a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration.
- Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems).
- Experience in data modeling and understanding of dimensional modeling concepts (e.g., star schema, snowflake schema).
- Solid understanding of BI concepts, data warehousing principles, and ETL processes.
- Experience in performance tuning and optimization of MicroStrategy reports and SQL queries.
- Ability to gather and analyze business requirements and translate them into technical specifications.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders.
- Experience with version control systems (e.g., Git).
- Ability to work independently and as part of a team.
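As one way to spot-check report output programmatically (referenced in the responsibilities above), here is a hedged sketch using the open-source mstrio-py client for the MicroStrategy REST API. The URL, credentials, project name, and report ID are placeholders, and the exact client API may vary by mstrio-py version.

```python
# Hedged sketch: pull a MicroStrategy report into pandas for validation.
# Assumes the third-party mstrio-py package; all identifiers are placeholders.
from mstrio.connection import Connection
from mstrio.project_objects import Report

conn = Connection(
    base_url="https://mstr.example.com/MicroStrategyLibrary/api",
    username="analyst",
    password="secret",
    project_name="BI Project",
)

# Fetch a published report and hand its rows to pandas for spot checks.
report = Report(connection=conn, id="REPORT_ID_PLACEHOLDER")
df = report.to_dataframe()
print(df.head())

conn.close()
```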

Posted 2 months ago


5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

About The Role :
As an SSAS Developer focused on SQL Server Analysis Services and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success.

Responsibilities :
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular).
- Create and manage OLAP cubes to support business intelligence reporting and analytics.
- Implement best practices for data modeling and cube design.
- Optimize the performance of SSAS solutions for efficient query processing and data retrieval.
- Tune SSAS models and cubes to ensure optimal performance.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models.
- Develop and implement ETL (Extract, Transform, Load) processes for data integration.
- Ensure data quality and consistency across integrated data sources.
- Support the development of business intelligence reports and dashboards.
- Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions.
- Provide technical support and troubleshooting for SSAS-related issues.
- Preferably have knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing.
- Integrate data from AWS S3 into SSAS models using PolyBase or other appropriate methods.

Required Skills & Qualifications :
Experience :
- 5-8 years of experience as a SQL Developer with a focus on SSAS and OLAP.
- Proven experience in designing and developing multidimensional and tabular SSAS models.
Technical Skills :
- Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development.
- Proficiency in writing MDX and DAX queries (see the sketch below).
- Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Experience with SQL Server databases and related tools.
- Preferably knowledge of AWS S3 and SQL Server PolyBase.
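To illustrate the MDX proficiency the posting asks for, below is a hedged sketch that runs an MDX query against an SSAS cube via the third-party pyadomd package (a thin wrapper over ADOMD.NET, so Windows-only). The server, catalog, cube, measure, and dimension names are invented for the example.

```python
# Hedged sketch: execute an MDX query against a hypothetical SSAS cube.
# Assumes the third-party pyadomd package; all names are placeholders.
from pyadomd import Pyadomd

conn_str = "Provider=MSOLAP;Data Source=localhost;Catalog=SalesCube;"

# Revenue by calendar year: measures on columns, a date hierarchy on rows.
mdx = """
    SELECT {[Measures].[Internet Sales Amount]} ON COLUMNS,
           [Date].[Calendar Year].Members ON ROWS
    FROM [Sales]
"""

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(mdx) as cur:
        for row in cur.fetchall():
            print(row)
```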

Posted 2 months ago
