3.0 - 7.0 years
0 Lacs
coimbatore, tamil nadu
On-site
You will be responsible for designing, building, and optimizing NetSuite analytics solutions and enterprise data warehouses as a NetSuite Analytics Developer & Data Warehousing expert. Your main focus will be on leveraging NetSuite's SuiteAnalytics tools in conjunction with external data warehousing platforms like Oracle Analytics Warehouse, Google Cloud Platform (GCP), and Snowflake. This will enable you to deliver scalable, data-driven insights through advanced visualizations and reporting across the organization.

Your key responsibilities will include designing, developing, and maintaining SuiteAnalytics reports, saved searches, and dashboards within NetSuite to meet the evolving business needs. You will also be required to build and optimize data pipelines and ETL processes to integrate NetSuite data into enterprise data warehouses, develop data models and schemas, and maintain data marts to support business intelligence and analytical requirements. Additionally, you will implement advanced visualizations and reports using tools such as Tableau, Power BI, or Looker, ensuring high performance and usability. Collaboration with business stakeholders to gather requirements and translate them into effective technical solutions will be crucial in this role. You will also be responsible for monitoring, troubleshooting, and optimizing data flow and reporting performance, ensuring data governance, security, and quality standards are maintained across analytics and reporting systems. Providing documentation, training, and support to end-users on analytics solutions will also be part of your responsibilities.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of experience working with NetSuite, including SuiteAnalytics (saved searches, datasets, workbooks). Strong expertise in data warehousing concepts, ETL processes, data modeling, proficiency in SQL, and experience with BI and visualization tools like Tableau, Power BI, or Looker are essential. An understanding of data governance, compliance, and best practices in data security is also required.

In summary, as a NetSuite Analytics Developer & Data Warehousing expert, you will play a vital role in designing, building, and optimizing analytics solutions and data warehouses to drive data-driven insights and reporting across the organization. Your expertise in SuiteAnalytics, data warehousing, ETL processes, and BI tools will be key in meeting the evolving business needs and ensuring high-quality analytics solutions are delivered.
Posted 2 months ago
5.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Mines and extracts data and applies statistics and algorithms necessary to derive insights for Digital Mine products and/or services.
- Supports the generation of an automated insights generation framework for business partners to effectively interpret data.
- Provides actionable insights through data science on Personalization, Search & Navigation, SEO & Promotions, Supply Chain, Services, and other related services.
- Develops dashboard reports that measure financial results, customer satisfaction, and engagement metrics.
- Conducts deep statistical analysis, including predictive and prescriptive modeling, in order to provide the organization a competitive advantage.
- Maintains expert-level knowledge on industry trends, emerging technologies, and new methodologies and applies it to projects.
- Contributes subject-matter expertise on automation and analytical projects, collaborating across functions.
- Translates requirements into an analytical approach; asks the right questions to understand the problem; validates understanding with the stakeholder or manager.
- Contributes to building the analytic approach to solving a business problem; helps identify the sources, methods, parameters, and procedures to be used; clarifies expectations with stakeholders.
- Leverages deep understanding of statistical techniques and tools to analyze data according to the project plan; communicates with stakeholders to provide updates.
- Prepares final recommendations, ensuring solutions are best-in-class, implementable, and scalable in the business.
- Executes plans for measuring impact based on discussions with stakeholders, partners, and senior team members.
- Executes projects with full adherence to enterprise project management practices.
Posted 2 months ago
5.0 - 10.0 years
5 - 10 Lacs
Gurgaon, Haryana, India
On-site
Key Responsibilities:
1. Data Analysis and Exploration: Conduct thorough data analysis to uncover actionable insights and trends. Collaborate with cross-functional teams to define data requirements and extract relevant data.
2. Data Visualization: Create visually compelling and informative data visualizations using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn). Develop interactive dashboards to present insights in an accessible and understandable manner.
3. Machine Learning and Predictive Modeling: Build, validate, and deploy machine learning models to solve business problems and predict outcomes. Optimize and fine-tune models for accuracy and efficiency. (A minimal modeling sketch follows this listing.)
4. Python Programming: Develop and maintain Python scripts and applications for data analysis and modeling. Write clean, efficient, and well-documented code for data manipulation and analytics.
5. ETL Pipeline Creation: Design and implement ETL pipelines to extract, transform, and load data from various sources into data warehouses. Ensure data quality, consistency, and accuracy within ETL processes.

Additional responsibilities: Enforce data governance best practices to maintain data quality, security, and compliance with industry regulations. Collaborate with IT and Data Engineering teams to optimize data storage and access. Work closely with cross-functional teams to understand business requirements and provide data-driven solutions. Effectively communicate findings and insights to both technical and non-technical stakeholders.

Preferred candidate profile:
- Proven experience in data science, data analysis, and predictive modeling.
- Proficiency in Python programming, data manipulation, and relevant libraries.
- Strong experience with data visualization tools (Power BI preferred).
- Knowledge of ETL processes and pipeline creation.
- Familiarity with machine learning techniques and frameworks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience with SQL and database management.
- Knowledge of the CAN bus protocol is a plus.
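To make the modeling responsibility in item 3 concrete, here is a minimal, hedged sketch of the build-validate-persist loop using scikit-learn. The CSV path and column names (customers.csv, churned) are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: train and validate a churn-style classifier, then persist it.
# Assumes a hypothetical customers.csv with numeric feature columns and a
# binary "churned" label; swap in real column names for actual use.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")
X = df.drop(columns=["churned"])
y = df["churned"]

# Hold out 20% of rows to validate the model on unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Validation: precision/recall per class on the held-out set.
print(classification_report(y_test, model.predict(X_test)))

# "Deploy" in the simplest sense: persist the fitted model for a serving job.
joblib.dump(model, "churn_model.joblib")
```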
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a System Developer specializing in C#, you will have the opportunity to contribute to engineering projects and the development of Windows applications, web applications, and system integration at NNE, an international pharmaceutical engineering company. Collaborating closely with NNIT and developers in Denmark, you will play a crucial role in the IT System Development team based in Bengaluru, India. Your responsibilities will include supporting engineering projects in both Denmark and India, participating in system development from concept to implementation, and working alongside engineers, supporters, and developer teams to ensure the successful delivery of production systems. Your role will require expertise in .NET and C# development, strong database and SQL skills, and familiarity with Agile practices like Scrum.

The ideal candidate for this position would possess the following skills:
- Proficiency in C# and a good understanding of its ecosystems
- Ability to develop reusable C# libraries
- Knowledge of various design and architectural patterns
- Experience with concurrency patterns in C#
- Familiarity with web application frameworks and Windows Presentation Foundation (WPF), and the ability to write clean, readable C# code
- Understanding of fundamental design principles and database schemas
- Experience in automated testing, code versioning tools, and continuous integration
- Hands-on experience with SSIS packages and SSAS Tabular Models

In addition to technical skills, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A result-oriented approach, effective communication, and the ability to work in a multicultural environment are essential qualities for this role. Fluency in English, both spoken and written, is required.

If you are ready to bring your skills and passion to NNE, we encourage you to apply before the deadline of 9th May 2025. For any inquiries, please contact IJRB@nne.com. Please submit your application through our online recruitment system and include a brief explanation of why you are applying in your resume or CV. At NNE, we are committed to an inclusive recruitment process and equality of opportunity for all applicants. We look forward to hearing from qualified candidates and will be scheduling interviews on an ongoing basis. To learn more about NNE and our work, visit www.nne.com.
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As an Integration Technical Specialist at Nasdaq Technology in Bangalore, India, you will be a key member of the Enterprise Solutions team. Nasdaq is a dynamic organization that constantly adapts to market changes and embraces new technologies to develop innovative solutions, aiming to shape the future of financial markets. In this role, you will be involved in delivering complex technical systems to customers, exploring new technologies in the FinTech industry, and driving central initiatives across Nasdaq's technology portfolio.

Your responsibilities will include collaborating with global teams to deliver solutions and services, interacting with internal customers, designing integrations with internal and third-party systems, performing end-to-end testing, participating in the software development process, and ensuring the quality of your work. You will work closely with experienced team members in Bangalore and collaborate with Nasdaq teams in other countries.

To be successful in this role, you should have 10 to 13 years of integration development experience, expertise in web services like REST and SOAP API programming, familiarity with Informatica Cloud and ETL processes, a strong understanding of AWS services such as S3, Lambda, and Glue, and a Bachelor's or Master's degree in computer science or a related field. Additionally, proficiency in Workday Integration tools, knowledge of finance organization processes, and experience in multinational companies are desirable.

At Nasdaq, you will be part of a vibrant and entrepreneurial environment that encourages initiative, challenges the status quo, and values authenticity. The company promotes a culture of connection, support, and empowerment, with a hybrid work model that prioritizes work-life balance and well-being. Benefits include an annual bonus, stock ownership opportunities, health insurance, a flexible working schedule, a mentorship program, and access to online learning resources.

If you are a passionate professional with a drive to deliver top technology solutions and thrive in a collaborative, innovative environment, we encourage you to apply in English at your earliest convenience. Nasdaq is committed to providing reasonable accommodations for individuals with disabilities during the application and interview process. Come as you are and join us in shaping the future of financial markets at Nasdaq.
Posted 2 months ago
2.0 - 6.0 years
4 - 8 Lacs
Hyderabad, Telangana, India
On-site
What you'll do as a BI Developer Lead:
- Design, develop, and maintain robust data pipelines using Azure Data Factory.
- Implement and manage ETL processes to ensure efficient data flow and transformation.
- Develop and maintain data models and data warehouses using Azure SQL Database and Azure Synapse Analytics.
- Create and manage Power BI reports and dashboards to provide actionable insights to stakeholders.
- Ensure data quality, integrity, and security across all data systems.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize data storage and retrieval processes for performance and cost efficiency.
- Monitor and troubleshoot data pipelines and workflows to ensure smooth operations.
- Create and maintain tabular models for efficient data analysis and reporting.
- Stay updated with the latest Azure services and best practices to continuously improve data infrastructure.

What will you bring to the team:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certification in Azure Data Engineer or related Azure certifications will be an added advantage.
- Experience with machine learning and AI services on Azure will be an added advantage.
- Proven experience in designing and maintaining data pipelines using Azure Data Factory.
- Strong proficiency in SQL and experience with Azure SQL Database.
- Hands-on experience with Azure Synapse Analytics and Azure Data Lake Storage.
- Proficiency in creating and managing Power BI reports and dashboards.
- Knowledge of Azure DevOps for CI/CD pipeline implementation.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Knowledge of data governance and compliance standards.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
The Manager, Business Insights plays a crucial role in the Business Insights team, ensuring that all Services can effectively make data-driven decisions and operate efficiently. As the Manager, Business Insights, you will collaborate with leadership across Sales, Delivery, Product, and other functions to enhance strategic decision-making through the use of facts and data. Your passion lies in identifying strategic gaps and opportunities within operational functions and determining corrective measures. Your expertise includes building data-driven infrastructure, configuring systems, managing data storage, and utilizing BI platforms. Additionally, you have a track record of driving productivity enhancements by identifying, procuring, or developing technology solutions that align with business requirements.

In this role, your responsibilities will include influencing decision-making processes to achieve better outcomes in a dedicated Services function. You will provide functional leaders with a fact-based approach and act as a thought partner. You will develop appropriate measurement frameworks, KPIs, and analysis questions to evaluate the health of the business within a specific function. By optimizing automation, simplifying processes, and fostering hands-on partnerships, you will ensure that team members can focus on their core responsibilities. Furthermore, you will lead special projects that lack a clear owner, such as M&A integration and Agile initiatives, by forming cross-functional teams.

Your success in this role will be driven by your exceptional problem-solving skills and your ability to develop scalable and automated frameworks and processes. You possess a deep understanding of business operations within a Services function and a strong desire to expand your knowledge. Proficiency in handling various data sets (e.g., Financial, Sales & Marketing, Costs) and familiarity with databases and data analytic tools (e.g., SQL, ETL processes, Tableau, Salesforce) are essential. Your technical acumen, coupled with experience in collaborating with internal developers and configuring third-party technical systems, distinguishes you. A well-rounded skill set and a generalist mentality, with previous consulting experience being advantageous, are qualities that you bring to the table. An effective verbal and written communicator at all organizational levels, you excel in articulating ideas and concepts.

If you are looking to drive data-driven decision-making, enhance operational efficiency, and lead impactful initiatives within a dynamic Services environment, this role as Manager, Business Insights is an excellent opportunity for you.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
coimbatore, tamil nadu
On-site
We are seeking a NetSuite Analytics Developer & Data Warehousing expert to design, build, and optimize NetSuite analytics solutions and enterprise data warehouses. Your role will involve leveraging NetSuite's SuiteAnalytics tools and external data warehousing platforms such as Oracle Analytics Warehouse, Google Cloud Platform (GCP), and Snowflake to deliver scalable, data-driven insights through advanced visualizations and reporting across the organization.

As a candidate, you will be responsible for designing, developing, and maintaining SuiteAnalytics reports, saved searches, and dashboards within NetSuite to meet evolving business needs. You will also build and optimize data pipelines and ETL processes to integrate NetSuite data into enterprise data warehouses, develop data models and schemas, and maintain data marts to support business intelligence and analytical requirements. Additionally, you will implement advanced visualizations and reports using tools such as Tableau, Power BI, or Looker, ensuring high performance and usability. Collaborating with business stakeholders to gather requirements and translate them into effective technical solutions, monitoring, troubleshooting, and optimizing data flow and reporting performance, ensuring data governance, security, and quality standards are upheld across analytics and reporting systems, and providing documentation, training, and support to end-users on analytics solutions are also part of your responsibilities.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with a minimum of 3 years of experience working with NetSuite, including SuiteAnalytics (saved searches, datasets, workbooks). You should have strong expertise in data warehousing concepts, ETL processes, and data modeling; hands-on experience with external data warehouse platforms such as Oracle Analytics Warehouse, GCP (BigQuery), or Snowflake; proficiency in SQL and performance optimization of complex queries; experience with BI and visualization tools like Tableau, Power BI, or Looker; and an understanding of data governance, compliance, and best practices in data security.

In summary, as a NetSuite Analytics Developer & Data Warehousing expert, you will play a crucial role in designing, building, and optimizing NetSuite analytics solutions and enterprise data warehouses, ensuring the delivery of scalable, data-driven insights through advanced visualizations and reporting across the organization.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at our company, you will be responsible for handling ETL processes using PySpark, SQL, Microsoft Fabric, and other relevant technologies. You will collaborate with clients and stakeholders to comprehend data requirements and devise efficient data models and solutions. Additionally, optimizing and tuning existing data pipelines for enhanced performance and scalability will be a crucial part of your role. Ensuring data quality and integrity throughout the data pipeline and documenting technical designs, processes, and procedures will also be part of your responsibilities. It is essential to stay updated on emerging technologies and best practices in data engineering and contribute to building CI/CD pipelines using GitHub.

To qualify for this role, you should hold a Bachelor's degree in computer science, engineering, or a related field, along with a minimum of 3 years of experience in data engineering or a similar role. A strong understanding of ETL concepts and best practices is required, as well as proficiency in Azure Synapse, Microsoft Fabric, and other data processing technologies. Experience with cloud-based data platforms such as Azure or AWS, knowledge of data warehousing concepts and methodologies, and proficiency in Python, PySpark, and SQL programming languages for data manipulation and scripting are also essential.

Desirable qualifications include experience with data lake concepts, familiarity with data visualization tools like Power BI or Tableau, and certifications in relevant technologies such as Microsoft Certified: Azure Data Engineer Associate.

Our company offers various benefits including group medical insurance, cab facility, meals/snacks, and a continuous learning program. Stratacent is a Global IT Consulting and Services firm with headquarters in Jersey City, NJ, and global delivery centers in Pune and Gurugram, along with offices in the USA, London, Canada, and South Africa. Specializing in Financial Services, Insurance, Healthcare, and Life Sciences, we assist our customers in their transformation journey by providing services in Information Security, Cloud Services, Data and AI, Automation, Application Development, and IT Operations. For more information, you can visit our website at http://stratacent.com.
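As a rough illustration of the PySpark-based ETL work this listing describes, the sketch below reads raw data, applies a simple transformation, and writes a curated table. The paths, column names, and derived fields are hypothetical placeholders; the same pattern applies on Fabric or Synapse Spark runtimes.

```python
# Minimal PySpark ETL sketch: extract CSV, transform, load as Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files with header and inferred schema.
raw = spark.read.csv("/landing/orders/*.csv", header=True, inferSchema=True)

# Transform: drop rows missing keys, normalize dates, add a derived column.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("net_amount", F.col("amount") - F.col("discount"))
       .dropDuplicates(["order_id"])
)

# Load: write a partitioned Parquet table for downstream consumers.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("/curated/orders"))
```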
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
bhubaneswar
On-site
The Software Development Lead plays a crucial role in developing and configuring software systems, whether for the entire product lifecycle or for specific stages. As a Software Development Lead, your main responsibilities include collaborating with different teams to ensure that the software meets client requirements, applying your expertise in technologies and methodologies to effectively support projects, and overseeing the implementation of solutions that improve operational efficiency and product quality.

You are expected to act as a subject matter expert (SME) and manage the team to deliver high-quality results. Your role involves making team decisions, engaging with multiple teams to contribute to key decisions, providing solutions to problems for your team and others, and facilitating knowledge-sharing sessions to enhance team capabilities. Additionally, you will monitor project progress to ensure alignment with strategic goals.

In terms of professional and technical skills, proficiency in AWS BigData is a must. You should have a strong understanding of data processing frameworks like Apache Hadoop and Apache Spark, experience in cloud services and architecture (especially in AWS environments), familiarity with data warehousing solutions and ETL processes, and the ability to implement data security and compliance measures.

Candidates applying for this role should have a minimum of 5 years of experience in AWS BigData. The position is based at our Bhubaneswar office, and 15 years of full-time education is required to be eligible for this role.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, our team in infrastructure is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, your role will involve creating and implementing scalable technology infrastructure solutions for our clients, encompassing tasks such as network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and optimizing data systems. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP); see the sketch after this listing.
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, Power Designer, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions, such as AWS, Azure, and GCP.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
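Since dimensional modeling features in both the responsibilities and the requirements, here is a minimal, hedged sketch of a toy star schema (one fact table, two dimensions), built in SQLite purely for illustration. All table and column names are invented, not part of the posting.

```python
# Hedged sketch: a tiny star schema built in an in-memory SQLite database.
# Table and column names are hypothetical placeholders for illustration.
import sqlite3

ddl = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g., 20240131
    full_date  TEXT NOT NULL,
    month      INTEGER NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    units       INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(ddl)  # create the fact and dimension tables
    print("star schema created")
```

The design choice this illustrates: facts (units, revenue) are kept narrow and numeric, while descriptive attributes live in the dimensions joined via surrogate keys.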
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a PowerBI Developer, you will be responsible for developing and maintaining scalable data pipelines using Python and PySpark. Your role will involve collaborating with data engineers and data scientists to understand and fulfill data processing needs. You will be expected to optimize and troubleshoot existing PySpark applications for performance improvements and write clean, efficient, and well-documented code following best practices. Participation in design and code reviews is essential to ensure high-quality deliverables.

Moreover, you will play a key role in developing and implementing ETL processes to extract, transform, and load data. It will be your responsibility to ensure data integrity and quality throughout the data lifecycle. Staying current with the latest industry trends and technologies in big data and cloud computing is crucial to excel in this role.

The ideal candidate should have a minimum of 6 years of experience in designing and developing advanced Power BI reports and dashboards. Proficiency in data modeling and DAX calculations is required, along with experience in developing and maintaining data models, creating reports and dashboards, analyzing and visualizing data, and ensuring data governance and compliance. Troubleshooting and optimizing Power BI solutions will also be part of your responsibilities.

Preferred skills for this role include a strong proficiency in Power BI Desktop, DAX, Power Query, and data modeling. Experience in analyzing data, creating visualizations, and building interactive dashboards is highly valued. Additionally, you should possess excellent communication and collaboration skills to effectively work with stakeholders. Familiarity with SQL and data warehousing concepts, as well as experience with UI/UX development, will be beneficial in successfully fulfilling the responsibilities of this position.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Senior Power BI Developer at Magna, you will play a crucial role in interpreting business needs and translating them into impactful Power BI reports and data insights products. Your responsibilities will include designing, developing, integrating, and maintaining business systems through cubes, ad-hoc reports, and dashboards using cutting-edge technologies like Microsoft Fabric and Databricks. You will collaborate closely with a diverse international team spanning Europe, North America, and Asia.

Your major responsibilities will involve working closely with business analysts and stakeholders to understand data visualization requirements and develop effective BI solutions. You will utilize your expertise in DAX to create calculated measures, columns, and tables that enhance data analysis capabilities within Power BI models. Additionally, you will optimize ETL processes using tools like Power Query, SQL, Databricks, and MS Fabric to ensure accurate and consistent data integration from various sources. In this role, you will implement best practices for data modeling, performance optimization, and data governance within Power BI projects. You will also collaborate with database administrators and data engineers to maintain seamless data flow and integrity. Furthermore, you will identify and address performance bottlenecks, optimize queries and data models, and implement security measures to safeguard data confidentiality.

To excel in this position, you must stay updated with Power BI advancements and industry trends, continuously seeking optimized solutions and technologies to enhance Magna's Power BI processes. Additionally, you will provide training sessions and technical support to end users, enabling self-service analytics and maximizing Power BI utilization. You will also support junior team members and collaborate with cross-functional teams to identify data-driven insights for strategic decision-making processes.

To qualify for this role, you should have a University Degree and more than 3 years of work-related experience in developing Business Intelligence solutions based on Microsoft Tabular models, including Power BI visualization and complex DAX expressions. Strong SQL coding skills, experience in data modeling, ETL processes, and Data Warehouse concepts, and proficiency in the Microsoft BI stack are essential. Knowledge of programming languages like Python or C# is a plus, along with excellent English language skills, analytical abilities, and effective communication skills.

This position may require working in the second or third shift, starting at 4:30 PM or later India time, with 10-25% regular travel. Magna offers a dynamic work environment within a global team, along with professional development opportunities and fair treatment for employees. Competitive salary and attractive benefits are provided based on skills and experience, reflecting market conditions. Join us at Magna to contribute to innovative mobility solutions and advance your career in the automotive technology industry.
Posted 2 months ago
7.0 - 12.0 years
0 Lacs
noida, uttar pradesh
On-site
As an Insurance Domain Expert specializing in data migration with 7-12 years of experience, you will lead and oversee data migration projects for a top-10 global IT services provider based in Noida. Your role will involve leveraging your deep knowledge of the insurance industry and data management principles to ensure the successful execution of data migration initiatives.

Key Responsibilities:
- Provide expert insights and best practices within the insurance domain to enhance data quality and compliance with regulatory changes.
- Develop and implement comprehensive data migration strategies, including data extraction, cleaning, transformation, and loading processes.
- Collaborate with technical teams and business stakeholders to align migration objectives with business goals.
- Identify and mitigate potential data migration risks, ensuring accuracy and integrity through testing and validation.
- Train team members and clients on new systems and data handling post-migration, offering ongoing support for data-related issues.

Qualifications:
- Bachelor's degree in Information Technology, Computer Science, or a related field; advanced degree preferred.
- Minimum of 7-10 years of experience in insurance domain data migration projects.
- Strong knowledge of insurance products, underwriting, claims, and regulatory requirements.
- Proficiency in data migration tools and techniques, particularly in ETL processes.
- Excellent analytical and problem-solving skills with attention to detail.
- Effective communication and presentation skills for engaging with stakeholders.

This is a full-time position based in Noida, with a hybrid work mode (2-3 days in the office per week). If you are comfortable with this arrangement and meet the qualifications and experience required, we encourage you to apply.

Job Type: Full-time
Schedule: Day shift, Monday to Friday

Application Question(s):
- What is your notice period?
- What is your current annual salary (in INR)?
- What is your expected annual salary (in INR)?

Education: Bachelor's (Required)
Experience:
- Insurance domain data migration: 10 years (Required)
- Data extraction, cleaning, transformation, and loading: 10 years (Required)

Location: Noida, Uttar Pradesh (Required)
Work Location: In person
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
An R Shiny web developer role at Iris Software involves building interactive data visualization and analysis tools using the R programming language and the Shiny framework. The primary responsibilities include designing, developing, and deploying dashboards and applications for users to explore and manipulate data dynamically. This position also entails data manipulation, UI/UX design, and ensuring efficient application performance.

Responsibilities:

Building and Maintaining Shiny Applications:
- Designing, developing, and maintaining interactive dashboards and applications using R and Shiny.
- Implementing user interfaces (UI) and server-side logic.
- Ensuring visually appealing and user-friendly applications.
- Updating existing applications based on changes in functionality or requirements.
- Debugging and testing applications to ensure correct functionality.

Data Handling and Manipulation:
- Analyzing datasets to identify relationships and prepare data for loading into databases.
- Creating and managing data pipelines and ETL processes.

Data Visualization and Reporting:
- Developing visualizations, reports, and dashboards using R Shiny.
- Creating data pipelines to integrate various data sources for comprehensive visualizations.

Collaboration and Communication:
- Working closely with developers, data scientists, and subject matter experts.
- Communicating technical information clearly and proposing solutions to technical challenges.

Mandatory Competencies:
- Data Science and Machine Learning - R Shiny
- User Interface - JavaScript
- Communication and Collaboration
- UX - Photoshop

Join Iris Software to experience a supportive work environment with world-class benefits designed to support your professional and personal growth. From health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, Iris Software is committed to your success and well-being. Be part of a team that values your talent and happiness.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Technical Systems Analyst at Albemarle, you will be a key player in implementing, scaling, and optimizing Business Process Mining initiatives using Celonis across the organization. Your role will be crucial in driving the successful adoption and maturity of Celonis to meet evolving business needs, accelerate time to value, and deliver measurable operational impact. You will collaborate with business process owners, subject matter experts, and IT teams to uncover inefficiencies, design data-driven solutions, and support automation and performance improvement initiatives. Additionally, you will work closely with stakeholders to understand operational goals and translate them into actionable Celonis use cases and technical requirements.

Your responsibilities will include identifying and integrating data sources, developing performant SQL queries, connecting Celonis with ERP and IT systems, ETL of data into the Celonis platform, designing process dashboards and analytical reports, conducting workshops, and collaborating with cross-functional teams to prioritize and deliver Celonis use cases and dashboards on time. Furthermore, you will promote a data-driven culture by enabling self-service analytics and training business users on Celonis capabilities. You will be required to document the design, configuration, testing, deployment, and integration of new or enhanced Celonis solutions, participate in testing activities, support issue resolution, maintain documentation, support governance and compliance activities, and participate in defining and designing BPM solutions for Albemarle. Additionally, you will monitor Celonis system performance and identify opportunities for optimization.

The ideal candidate for this role should have a Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field, along with 3+ years of hands-on experience with Celonis EMS and 5+ years of experience in business intelligence, data engineering, or business analytics. Strong expertise in SQL, PQL, data analysis, and data visualization techniques is required, along with Celonis certifications. Proficiency with agile methodologies, scripting and ETL languages, and knowledge of data privacy, security, and compliance standards are preferred.

Join Albemarle and be a part of a values-led organization committed to building a more resilient world with a focus on people and the planet. You can expect competitive compensation, a comprehensive benefits package, and resources to support your professional and personal growth. Shape the future, build with purpose, and grow together with us at Albemarle.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are a dynamic Tableau Analyst joining the Tools and Reporting team, responsible for crafting insightful dashboards and providing PMO support for project delivery. Strong data experience is required, including proficiency in Tableau, Python, or Knime, and familiarity with project management methodologies. Expertise in building Tableau dashboards, data extraction, and data modeling is essential.

In your role, you will design, develop, and maintain interactive Tableau dashboards to visualize complex datasets effectively. You will use Tableau Prep for data cleaning and transformation, develop complex calculations and custom formulas, and optimize dashboard performance. Collaborating with stakeholders to gather requirements and ensure data accuracy within dashboards is key.

Additionally, you will assist in planning, executing, and monitoring projects related to data visualization and analytics, applying project management methodologies such as Agile, Iterative, and Waterfall. Tracking project progress, identifying risks, and communicating updates to stakeholders are part of your responsibilities. You will also demonstrate a strong understanding of data concepts, including data modeling, data warehousing, and ETL processes.

Qualifications for this role include proven experience as a Tableau Developer, proficiency in Tableau Prep, experience with Python and Knime for data manipulation, and familiarity with project management methodologies. Excellent communication and collaboration skills and Tableau certification are required. A Bachelor's degree in Computer Science, Information Technology, or a related field is necessary.

This summary provides an overview of the responsibilities and qualifications for the Tableau Analyst position; other duties may be assigned as needed.
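As a hedged illustration of the Python-side data preparation this role pairs with Tableau, the sketch below cleans a raw extract and writes a tidy CSV that Tableau (or Tableau Prep) can consume as a data source. The file and column names are hypothetical, not details from the posting.

```python
# Minimal data-prep sketch: clean a raw extract for a Tableau data source.
# File and column names are hypothetical placeholders.
import pandas as pd

raw = pd.read_csv("raw_sales.csv", parse_dates=["order_date"])

tidy = (
    raw.dropna(subset=["region", "revenue"])           # drop unusable rows
       .assign(region=lambda d: d["region"].str.strip().str.title())
       .groupby(["order_date", "region"], as_index=False)["revenue"]
       .sum()                                           # one row per day/region
)

# Tableau connects directly to the resulting CSV (or a database table).
tidy.to_csv("sales_for_tableau.csv", index=False)
```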
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
The opportunity

As a Data Migration Specialist at Hitachi Energy, you will play a crucial role in developing and executing comprehensive data migration strategies. Your responsibilities will include analyzing legacy systems, designing and implementing ETL processes using SAP BusinessObjects Data Services (BODS), and optimizing BODS jobs for performance and reliability. You will provide functional expertise in SAP SD, MM, and PP modules to ensure accurate data alignment, drive data quality improvement initiatives, and collaborate with cross-functional teams and stakeholders to ensure clear communication and documentation.

Your background

To excel in this role, you should hold a Bachelor's degree in Computer Science, IT, or a related field and have at least 8 years of experience in SAP data migration, including experience with SAP S/4HANA (preferred). You should be proficient in SAP BusinessObjects Data Services (BODS) and other data migration tools, possess strong functional knowledge of SAP SD, MM, and PP modules, and be skilled in data analysis, cleansing, and transformation techniques. Excellent problem-solving, analytical, and communication skills are essential, along with the ability to work both independently and collaboratively in team environments. Possessing SAP certification and project management experience would be considered a plus.

Hitachi Energy is a global technology leader committed to advancing a sustainable energy future. With a focus on serving customers in the utility, industry, and infrastructure sectors, we offer innovative solutions and services across the value chain. Join our diverse team of around 45,000 employees in 90 countries who work together each day to challenge the status quo and drive innovation. Apply now to be part of our global team and contribute to a carbon-neutral future through the power of Diversity + Collaboration = Great Innovation.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for developing and supporting new standardised SAC reporting stories, reports, dashboards, and models in SAC that facilitate month-end, quarter-end, and year-end reporting processes for the Finance community, enabling improved decision-making by granting access to pre-built standardised reports and dashboards. You will manage stakeholder requirements to ensure that the design of SAC reports, dashboards, and models meets those requirements and enables best-in-class reporting. Additionally, you will coach SAC Analysts through the development process, from design to build, test, and deploy activities, guiding them on leading design and build practices.

As an experienced manager of SAC reporting design or development in a large organisation, you should have a strong track record of delivering efficient and effective SAC reporting templates and models. You should be a commercially astute SAP advocate, comfortable with translating stakeholders' planning requirements into best-in-class SAC planning templates and models. Leveraging your technical expertise, you will design and deliver SAP reporting capabilities that positively transform reporting design and development. Furthermore, you should be a leader capable of nurturing talent to build in-house SAC reporting capability.

Your responsibilities will include designing and managing the development of SAC reporting stories, reports, dashboards, and SAC models to enhance the reporting capability across BT Finance. You will advocate for SAP reporting capability development and efficient system-based management reporting approaches. By implementing best-in-class reporting and dashboarding development practices, you will deliver flexible and versatile solutions. Additionally, you will manage reporting capability development projects and report on progress, risks, and issue resolution. Engaging with Group and Unit Finance stakeholders, you will ensure that new SAC developments meet their reporting capability requirements. Collaborating closely with the BT Technology function, you will ensure appropriate support for new reports, dashboards, and model development projects. Making architectural decisions in line with best practices, you will manage stakeholder relationships across Finance and Technology to deliver SAP planning development project initiatives. Establishing robust checking and validation processes for planning capability development, you will maintain a zero-tolerance approach to technical errors. It is essential to have a thorough understanding of current performance trends across BT to ensure planning capability developments support key commercial challenges. Furthermore, you will develop SAC Planning Analysts through programme-level coaching and review.

The skills and experience you need:
- Strong hands-on technical skills in SAP Analytics Cloud (SAC), BW, and AFO
- Proficiency in SAP Analytics Cloud (SAC), including data modeling, story building, and dashboard creation
- Strong experience in designing and building analytical applications in SAC to support complex dashboards and reports
- Understanding of SAC connectivity options (e.g., live data connections, import data connections)
- Ability to write complex calculations and formulas within SAC to enhance reporting and dashboarding, aligned with leading practice
- Expertise in SAC reporting capability development best practices
- Ability to translate functional requirements into technical design and influence stakeholders to navigate design decisions
- Excellent stakeholder management and relationship-building skills
- Keen attention to detail and a solution-driven, innovative mindset focused on commercial impact
- A Finance qualification (ACCA, ACA, CIMA) is a plus
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a SAP BW Data Engineer, you will leverage your 7+ years of hands-on experience in SAP BW and ABAP development to drive data modeling, ETL processes, and architecture. Your expertise in SAP BW/4HANA and BW on HANA will be essential in ensuring the efficiency and effectiveness of data processes. Your proficiency in ABAP programming, including user exits, BADIs, and enhancements, will play a key role in developing and optimizing solutions.

Your deep understanding of SAP BW architecture, data modeling, and ETL processes will enable you to design and implement robust data solutions. Experience with SAP Analytics tools like SAP Datasphere will further enhance your capabilities. You will excel in troubleshooting complex technical issues and delivering timely and effective solutions. Your strong communication and collaboration skills will enable you to work efficiently in a team environment.

Additionally, having experience with Embedded Analytics, certification in SAP BW, SAP SAC, or related modules, and familiarity with Agile or Scrum methodologies will be advantageous. Your strong analytical and problem-solving skills will be crucial in delivering business value. Furthermore, your ability to mentor and guide junior team members will contribute to the overall success of the team.
Posted 2 months ago
9.0 - 14.0 years
30 - 40 Lacs
Pune, Chennai
Work from Office
Designing, implementing, and optimizing data solutions using both Azure and Snowflake. Experience working with the Matillion tool. Hands-on experience with Azure and Snowflake, including data modeling, ETL processes, and data warehousing. Proficiency in SQL and data integration tools.
Posted 2 months ago
4.0 - 8.0 years
0 - 1 Lacs
Hyderabad, Navi Mumbai, Pune
Work from Office
Role & responsibilities

Key Responsibilities:
- Design, develop, and deploy interactive dashboards and visualizations using TIBCO Spotfire.
- Work with stakeholders to gather business requirements and translate them into scalable BI solutions.
- Optimize Spotfire performance and apply best practices in visualization and data storytelling.
- Integrate data from multiple sources such as SQL databases, APIs, Excel, SAP, or cloud platforms.
- Implement advanced analytics using IronPython scripting, data functions, and R/statistical integration.
- Conduct data profiling, cleansing, and validation to ensure accuracy and consistency.
- Support end-users with training, troubleshooting, and dashboard enhancements.

Must-Have Skills:
- 5-8 years of experience in BI and Data Visualization.
- Minimum 4 years hands-on with TIBCO Spotfire, including custom expressions and calculated columns.
- Strong knowledge of data modeling, ETL processes, and SQL scripting.
- Expertise in IronPython scripting for interactivity and automation within Spotfire.
- Experience working with large datasets and performance-tuning visualizations.

Good to Have:
- Experience with R, Python, or Statistica for advanced analytics in Spotfire.
- Familiarity with cloud-based data platforms (AWS Redshift, Snowflake, Azure Synapse).
- Understanding of data governance, metadata management, and access controls.
- Exposure to other BI tools like Tableau, Power BI, or QlikView.
Posted 2 months ago
6.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Calfus is a Silicon Valley headquartered software engineering and platforms company that seeks to inspire its team to rise faster, higher, stronger, and work together to build software at speed and scale. The company's core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes while standing for #Equity and #Diversity in its ecosystem and society at large.

As a Data Engineer specializing in BI Analytics & DWH at Calfus, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower the organization to make data-driven decisions. Leveraging expertise in Power BI, Tableau, and ETL processes, you will create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytical and DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Utilize SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 6-12 years of experience in BI architecture and development, with a strong focus on Power BI and Tableau.
- Proven experience with ETL processes and tools, especially SSIS.
- Strong proficiency in SQL Server, including advanced query writing and database management.
- Exploratory data analysis with Python.
- Familiarity with the CRISP-DM model.
- Ability to work with different data models.
- Familiarity with databases like Snowflake, Postgres, Redshift & MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, Plotly, and/or Dash.
- Strong programming foundation with Python for data manipulation and analysis using Pandas, NumPy, and PySpark; data serialization formats like JSON, CSV, Parquet & Pickle; database interaction; data pipeline and ETL tools; cloud services & tools; and code quality and management using version control.
- Ability to interact with REST APIs and perform web scraping tasks is a plus.

Calfus Inc. is an Equal Opportunity Employer.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
vadodara, gujarat
On-site
You will be working with Polymer, a smart data loss prevention (DLP) system that offers advanced cloud & AI data security and compliance solutions. By leveraging Polymer, you will play a crucial role in automating data protection processes, reducing data exposure risks, and enabling employees to enhance data security practices seamlessly within their existing workflows.

Your responsibilities will include designing, developing, and maintaining ETL processes within large-scale data environments utilizing tools such as Snowflake and BigQuery. You will be tasked with constructing and deploying data pipelines to manage data ingestion, transformation, and loading operations from diverse sources. Additionally, you will create and manage data models and schemas optimized for performance and scalability, leveraging BI tools like QuickSight, Tableau, or Sigma to generate interactive dashboards and reports.

Collaboration with stakeholders to grasp business requirements and convert them into technical solutions will be a key aspect of your role. You will communicate complex data insights clearly to both technical and non-technical audiences, proactively identify and resolve data quality issues and performance bottlenecks, and contribute to enhancing the data infrastructure and best practices within the organization.

As a qualified candidate, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Computer Engineering, or a related field, along with 3-5 years of experience in a data science/engineering role. Proficiency in Python, including experience with Django or Flask, is essential, while expertise in Snowflake and BigQuery is advantageous. Experience with relational databases like MySQL or PostgreSQL, designing ETL processes in large-scale data environments, and working with cloud platforms such as AWS or GCP is highly valued.

Your problem-solving and analytical skills, combined with a data-driven mindset, will be crucial in this role. Strong communication, interpersonal skills, and the ability to work both independently and collaboratively within a team are essential attributes. Familiarity with Agile development methodologies will be beneficial for success in this position.

This is an onsite opportunity located in Vadodara, Gujarat, India.
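Given the listing's emphasis on Python and BigQuery, here is a minimal, hedged sketch of loading a cleaned DataFrame into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders, not Polymer's actual schema; the pattern for Snowflake would be analogous with its own connector.

```python
# Hedged sketch: load a cleaned DataFrame into BigQuery.
# Requires google-cloud-bigquery, pandas, and pyarrow; all names below
# (events.csv, my_project.analytics.events, columns) are placeholders.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Light cleaning before load: dedupe on the key, drop rows missing user_id.
df = pd.read_csv("events.csv", parse_dates=["created_at"])
df = df.drop_duplicates(subset=["event_id"]).dropna(subset=["user_id"])

job = client.load_table_from_dataframe(
    df,
    "my_project.analytics.events",  # placeholder table id
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
)
job.result()  # block until the load job completes
print(f"Loaded {job.output_rows} rows")
```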
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The AIML Architect - Dataflow, BigQuery plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align technical vision with business goals. Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow (a minimal pipeline sketch follows this listing).
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets.
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated with industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficient in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.
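As a rough illustration of the BigQuery-plus-Dataflow pipeline work named in the first responsibility, the sketch below is a minimal Apache Beam pipeline that can run on Dataflow. The project, region, bucket, table, and schema are hypothetical placeholders under the stated assumptions, not a definitive implementation.

```python
# Hedged sketch: a Beam pipeline (Dataflow-runnable) that parses CSV lines
# and loads them into BigQuery. All resource names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    # Assumed layout: event_id,user_id,amount
    event_id, user_id, amount = line.split(",")
    return {"event_id": event_id, "user_id": user_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DataflowRunner",             # use "DirectRunner" for local tests
    project="my-gcp-project",            # placeholder
    region="us-central1",
    temp_location="gs://my-bucket/tmp",  # placeholder
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv",
                                         skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Load" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.events",  # placeholder table
            schema="event_id:STRING,user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```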
Posted 2 months ago