1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
As a skilled Data Engineer, you will leverage your expertise to contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing data engineering processes. Your advanced knowledge of SQL programming and database management will be key in ensuring the efficiency of data operations.

In this role, you will utilize your solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics. Your understanding of data warehousing concepts and best practices will enable you to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions.

Collaboration and communication are essential aspects of this role, as you will work closely with stakeholders to define requirements and develop data-driven insights. Your ability to work both independently and as part of a team will be crucial in ensuring the successful delivery of projects. Additionally, your proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment.

Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Your strong oral, written, and interpersonal communication skills will enable you to effectively convey insights and solutions to stakeholders. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required for this role.

Overall, as a Data Engineer, you will play a vital role in developing and maintaining data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will contribute to ensuring data accuracy and integrity across all systems.
Your analytical mindset and ability to perform root cause analysis will be key in identifying opportunities for improvement and driving data-driven decision-making within the organization.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Python AI/ML Developer with 4-5 years of experience, your main responsibility will be to design, develop, and maintain Python applications while ensuring code quality, efficiency, and scalability. You will collaborate with cross-functional teams to understand project requirements and deliver solutions that align with business objectives. Implementing AI/ML algorithms and models to solve complex problems and extract valuable insights from data will be a key part of your role. Additionally, you will develop and maintain RESTful APIs to integrate Python applications with other systems. It is essential to stay updated with the latest trends and technologies in Python development and AI/ML.

To excel in this role, you should have strong proficiency in Python programming, including object-oriented programming and design patterns. Experience with popular Python libraries and frameworks such as NumPy, Pandas, Scikit-learn, TensorFlow, and PyTorch is required. Knowledge of AI/ML concepts, algorithms, and techniques, including supervised and unsupervised learning, is crucial. Experience working with data pipelines and ETL processes is beneficial, and hands-on experience with chatbot applications is necessary. Excellent problem-solving and analytical skills are essential, along with the ability to work independently and as part of a team. Strong communication and documentation skills are also important.

Preferred qualifications include experience with cloud platforms such as AWS, GCP, or Azure, knowledge of natural language processing (NLP) or computer vision, experience with machine learning deployment and operationalization, and contributions to open-source Python projects. Stay updated with the latest advancements in technology to enhance your skills and contribute effectively to the development of innovative solutions.
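To make the supervised-learning requirement concrete, here is a minimal, hedged sketch of fitting a one-variable linear model with plain Python. The data and function name are invented for illustration; a real project in this role would use Scikit-learn or a similar library rather than hand-rolled math.

```python
# Minimal supervised-learning sketch: ordinary least squares for y = a*x + b.
# Illustrative only; production work would use Scikit-learn, TensorFlow, etc.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error over the samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical training data roughly following y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 4.9, 7.2, 9.0, 11.1]
slope, intercept = fit_line(xs, ys)
```

The same fit-then-predict shape carries over to the library-based pipelines the posting asks for; only the estimator changes.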
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
As a Tableau Developer with a minimum of 4 years of experience in a similar data visualization role specializing in Tableau, you will be a valuable addition to our team in Worli, Mumbai. Your primary responsibilities will include preparing and structuring data for use in Tableau, creating and optimizing Tableau dashboards, generating insightful data reports, and collaborating closely with various teams to meet business requirements. If you are a data-driven professional with a passion for visualization and reporting, we are excited to hear from you!

In this role, you will have the opportunity to work on a variety of tasks such as establishing and maintaining connections to various data sources, developing workbooks and Hyper Extracts, publishing workbooks and data sources to Tableau Server, and conducting unit testing to ensure data accuracy and functionality. You will also be responsible for troubleshooting and resolving performance or data-related issues in Tableau workbooks and data sources, as well as providing knowledge transfer and support to end-users.

To be successful in this role, you must have hands-on experience in creating table calculations, functions, filters, calculated fields, parameters, reference lines, bins, sets, groups, and hierarchies in Tableau. Expertise in developing interactive dashboards, data blending, and performance tuning of Tableau dashboards is essential. Additionally, you should be proficient in writing and optimizing SQL queries for relational databases, connecting to multiple data sources, ensuring data integrity, and updating data as needed.

If you are skilled in Tableau Desktop and Tableau Server, have experience with Python automation and ETL processes, and possess proficiency in Microsoft Excel, including advanced functions, pivot tables, and data manipulation, we encourage you to apply. Strong communication, problem-solving, and critical-thinking skills are also key requirements for this role.
Join our dynamic and innovative environment where you will collaborate with a talented team of professionals, have opportunities for growth and learning in the field of data visualization and business intelligence, and receive competitive compensation and benefits. If you are passionate about data, analytics, and creating impactful visualizations, submit your updated resume and a brief cover letter outlining your experience in Tableau development and data visualization to apply.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As the Global Head (f/m/d) at Siemens Energy, you play a crucial role as a digital business partner for various business units, driving digital transformation and enhancing operational efficiency. Collaborating closely with business unit heads and partners, you will understand their needs and craft effective digital solutions. Your expertise in AI and Field Service Management will be instrumental in developing innovative AI-based solutions to elevate service delivery and operational excellence.

Your responsibilities include leading a team of custom developers to meet the evolving needs of Business Units and spearheading digitalization efforts. You will devise strategies to expand the IT team to support digitalization requirements through digital tools and automation initiatives. Assessing the current landscape in business units and integrating them into the IT structure will be a part of your role. Acting as a digital business partner, you will provide tailored digital solutions by leveraging visualizations, ETL processes, and low-code management tools to drive digital transformation and operational efficiency.

Your role also involves designing and implementing AI-based solutions to optimize service delivery, ensuring successful deployment and adoption of digital tools across the organization. Providing leadership and guidance to the IT team to foster a culture of innovation and continuous improvement is essential. Monitoring and evaluating the performance of digital tools and automation initiatives to align with desired outcomes will be one of your key responsibilities.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 10 years of IT experience focusing on software engineering, digital tools, and automation. Additionally, you should have at least 3 years of experience in leading developer teams.
Your ability to translate business requirements into actionable digitalization initiatives and drive digital transformation, along with expertise in AI, visualizations, ETL processes, and low-code management tools, is crucial for this role. Strong leadership, communication, and collaboration skills and hands-on knowledge of AI are required to excel in this position.

Join Siemens Energy, a global company with a commitment to sustainable and reliable energy solutions. Explore the opportunity to contribute to the energy transition and drive innovation while upholding a legacy of over 150 years. Siemens Energy values diversity and inclusion, empowering individuals from various backgrounds to work together towards a common goal of sustainable energy for all.

Discover how you can be part of Siemens Energy's mission by visiting https://www.siemens-energy.com/employeevideo. Benefit from working in a diverse and inclusive environment with perks like opportunities to lead innovative projects, medical benefits, paid time off, parental leave, and continuous learning through the Learn@Siemens-Energy platform.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a talented and detail-oriented Business Intelligence (BI) Developer with a focus on developing visually appealing dashboards. Your role is crucial in translating complex data sets into user-friendly visualizations to highlight key insights and trends. You will design, develop, and maintain interactive dashboards using Power BI, working with large datasets to extract, clean, and transform data for consumption. Collaborating with business stakeholders, you will understand their data needs and translate them into dashboard requirements. Regular feedback sessions with end-users will ensure that the dashboards meet business needs effectively. Your responsibilities also include optimizing dashboards for performance and usability, updating them regularly to reflect the latest data and business metrics.

To qualify for this role, you should have at least 5 years of experience as a BI Developer, focusing on dashboard development. Proficiency in Power BI, strong SQL skills, and experience with database management systems are essential. You should possess excellent data visualization skills, experience with ETL processes and tools, and familiarity with data warehousing concepts and cloud platforms. Knowledge of programming languages like Python or R, understanding of data governance and security best practices, and the ability to translate business requirements into effective dashboards are also required. Strong analytical, problem-solving, communication, and collaboration skills are crucial for success in this role. A Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field is necessary for consideration.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Data Engineering Lead/Architect with over 10 years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements efficiently. Collaborating with cross-functional teams, you will define data architectures, models, and integration strategies to ensure the successful implementation of data pipelines, ETL processes, and data warehousing solutions.

Your expertise in Snowflake technologies will be essential in building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices such as cost analysis, resource allocation, and security configurations to support reporting and analytics needs effectively.

Utilizing Azure cloud services and Databricks platforms, you will manage and process large datasets efficiently. Your responsibilities will include building, deploying, and maintaining data pipelines on Azure Data Factory, Azure Databricks, and other Azure services. Implementing best practices for data warehousing, ensuring data quality, consistency, and reliability will be a key focus area. You will also create and manage data integration processes, including real-time and batch data movement between systems.

Your mastery of SQL and PL/SQL will be vital in writing complex queries to extract, transform, and load data effectively. You will optimize SQL queries and database performance for high-volume data processing to ensure seamless operations.

Continuously monitoring and enhancing the performance of data pipelines and storage systems will be part of your responsibilities. You will troubleshoot and resolve data-related issues promptly to minimize downtime and maintain data availability. Documenting data engineering processes, data flows, and architectural decisions will be crucial for effective collaboration with data scientists, analysts, and stakeholders.
Additionally, implementing data security measures and adhering to compliance standards like GDPR and HIPAA will be essential to protect sensitive data.

In addition to your technical skills, you are expected to showcase leadership abilities by driving data engineering strategies, engaging in sales and proposal activities, developing strong customer relationships, and mentoring other team members. Your experience with cloud-based data solution architectures, client engagement, and leading technical teams will be valuable assets in this role.

To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science or a related field. You must have over 10 years of experience in Data Engineering, with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks technologies, along with comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques, is required. Exceptional SQL and PL/SQL skills, experience with performance tuning, and strong problem-solving abilities are essential. Excellent communication skills and relevant certifications in technologies like Snowflake and Azure will be advantageous.
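As a small, hedged illustration of the extract-transform-load pattern this role centers on, the sketch below stages raw rows, casts and aggregates them, and loads the result into a target table. It uses the stdlib sqlite3 module purely for self-containment; table and column names are invented, and a real pipeline here would target Snowflake or Azure services instead.

```python
import sqlite3

# Minimal ETL sketch with stdlib sqlite3; the schema is hypothetical.
conn = sqlite3.connect(":memory:")

# Extract: raw landing table with amounts stored as text.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "100.50", "north"), (2, "250.00", "north"), (3, "75.25", "south")],
)

# Transform + Load: cast amounts to numeric and aggregate per region.
conn.execute("CREATE TABLE region_totals (region TEXT, total REAL)")
conn.execute(
    """INSERT INTO region_totals
       SELECT region, SUM(CAST(amount AS REAL))
       FROM raw_orders
       GROUP BY region"""
)

totals = dict(conn.execute("SELECT region, total FROM region_totals ORDER BY region"))
```

The same extract/cast/aggregate/load shape is what Azure Data Factory or Snowflake tasks would orchestrate at scale.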
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a meticulous and analytical Financial Data Analyst to join our finance team. As a Financial Data Analyst, you will be responsible for managing and analyzing large datasets to provide crucial financial insights and support strategic decision-making. The ideal candidate will excel in handling and analyzing large datasets, utilizing their financial expertise to drive insights and facilitate strategic decisions. Your role will be vital in ensuring data integrity, implementing best practices, and offering actionable recommendations to aid our company in achieving its financial objectives.

To qualify for this position, you should possess a Bachelor's degree in Finance, Accounting, Data Science, Statistics, Computer Science, or a related field. An advanced degree or relevant certifications such as CFA or CPA would be advantageous. Additionally, a minimum of 5 years of experience in data management, financial analysis, or a related role is required, with proven expertise in managing large datasets and financial modeling.

The role demands proficiency in data management tools like SQL, ETL processes, and data warehousing, along with advanced knowledge of financial software and systems such as ERP and BI tools like Tableau and Power BI. Strong skills in data analysis and statistical methods are essential, as well as excellent problem-solving abilities to interpret complex data and make informed decisions. Effective communication skills, both verbal and written, are crucial for presenting complex information clearly and concisely. Attention to detail is paramount, ensuring a high level of accuracy in data analysis and financial reporting.

In this position, your key responsibilities will include managing and analyzing large financial datasets, developing and maintaining financial models, analyzing financial data to identify trends, patterns, and anomalies, and providing actionable insights to stakeholders.
You will apply financial acumen to analyze complex datasets, create and maintain dashboards and visualizations, prepare detailed financial reports, forecasts, and budgets, and collaborate with finance and accounting teams to ensure data consistency and alignment with financial goals.

Furthermore, you will be responsible for creating and maintaining comprehensive documentation of data processes, analysis methodologies, and financial models, collaborating with cross-functional teams to understand data needs, providing data-driven recommendations to support business strategies, identifying opportunities for process improvements, and automating tasks to enhance data management and analysis efficiency.

Join us for exciting projects in industries like High-Tech, communication, media, healthcare, retail, and telecom. Enjoy a collaborative environment where you can expand your skills and maintain a healthy work-life balance with flexible schedules and opportunities for professional development. We offer competitive salaries, various benefits, and fun perks to create a vibrant and rewarding workplace. Come be a part of GlobalLogic, a leader in digital engineering, and help build innovative products and digital experiences for global brands across diverse industries.
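To make the "trends, patterns, and anomalies" responsibility concrete, here is a hedged sketch of one common technique, z-score outlier flagging, using only the standard library. The expense figures and the 2.0 threshold are invented for illustration and are not part of the role description.

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return the indices of values whose z-score magnitude exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical monthly expense figures; the spike at index 5 stands out.
expenses = [100.0, 102.0, 98.0, 101.0, 99.0, 400.0, 103.0]
outliers = flag_anomalies(expenses)
```

In practice the same check would run inside a BI tool or a pandas pipeline, but the statistical idea is identical.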
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
As a Senior Professional Campaign Developer, you will be responsible for the development, implementation, and execution of personalized marketing campaigns. Your ability to work on campaign optimizations, analysis, workflow, journeys, and act as a consultant for the tool technologies and marketing automation solutions will be crucial.

You will be a Senior Campaign Developer with a proven track record in designing, developing, QA, and configuring Marketing Technology (AJO). Your advanced technical skills will be utilized to construct sophisticated ETL processes, enhance product functionality, and implement large-scale, real-time delivery of dynamic content communications. Demonstrating a comprehensive knowledge of multi-channel marketing strategies will empower you to lead insightful business discussions and engage directly with stakeholders on functional matters.

Your role within the Marketing Platform team will be pivotal in driving client value through the implementation of marketing platforms for executing campaigns. You will focus on streamlining campaign launches, optimizing audience selection, and enhancing performance to deliver high-quality, targeted campaigns that meet client objectives and drive ROI.

Your responsibilities will include campaign estimation, requirement gathering, solution design, platform setup with foundational schemas, performance monitoring, result analysis, and reporting. Additionally, you will think creatively to design and deliver complex solutions, enhance implementation and migration activities, offer technology consulting, stay current with new technologies, ensure precise documentation, and mentor team members to achieve client success and desired outcomes.
Your responsibilities will include:
- Successfully leading and executing the design and development of marketing automation campaigns.
- Implementing tracking and analytics solutions to measure the effectiveness of marketing campaigns and customer engagement.
- Collaborating with marketing stakeholders to gather requirements and translate business needs into technical solutions.
- Designing and implementing complex marketing campaigns via various digital channels.
- Conducting code reviews and ensuring adherence to best practices and coding standards.
- Configuring and optimizing dynamic content personalization, data manipulation, and integration with external systems.
- Implementing audience segmentation strategies.
- Developing custom reporting and dashboards.
- Staying informed about industry trends, best practices, and emerging technologies.
- Proactively identifying opportunities for process improvements and driving initiatives to increase operational efficiency and campaign effectiveness.
- Mentoring junior developers and providing guidance on tool development best practices and methodologies.
- Troubleshooting and resolving technical issues related to tool configuration, data integration, and campaign execution.

To qualify for this role, you should have a 4-year college degree in a technical field or equivalent and an Adobe Campaign-related certification.
Additionally, you should have:
- Experience in technology integrations with Marketing technologies (AJO, ACC, AEP).
- A minimum of 1 year of experience with the AJO platform and more than 4 years working on Campaign implementations or managing projects.
- Hands-on experience with the Adobe Campaign tool.
- 3+ years of SQL development experience.
- 3+ years of scripting/programming experience (JavaScript preferred).
- 1+ years of experience in cross-channel marketing campaign delivery, such as Push Notifications and In-App deliveries.
- Eagerness to learn new applications of traditional technologies.
- The ability to articulate and convey technical concepts to a mixed audience.
- Familiarity with enterprise software and project implementation life cycles.
- Knowledge of CRM and/or marketing applications and processes.
- Knowledge of basic marketing concepts such as campaign ROI metrics and KRAs.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
You will collaborate with Business Analysts/business SMEs to transform functional requirements into technical solutions. This involves identifying all source systems, tables, and fields required for Celonis integration. Your responsibilities will include connecting, extracting, transforming, and loading data from various source systems to construct the data model. Writing efficient, scalable, and comprehensible SQL queries, optimizing existing queries, and enhancing data pipelines will be crucial aspects of your role.

Furthermore, you will be tasked with designing and implementing new Process Connectors in Celonis, as well as refining and expanding existing connectors. Utilizing PQL (Process Query Language), you will develop formulas to retrieve data from the process mining engine and create process mining analyses and dashboards.

To excel in this position, you should possess over 4 years of recent experience in SQL coding with a strong background in ETL processes. A solid understanding of ETL jobs, data warehouses/lakes, data modeling, and schema design is essential. Your expertise should extend to relational databases and data modeling, with prior experience working with the Celonis platform. It is preferred that you are Celonis Certified and have proficiency with the Cloud version of the platform. Previous involvement in business intelligence, data engineering, business analytics, or related fields will be advantageous. Familiarity with query languages such as PQL is also beneficial. Strong communication skills and the ability to effectively manage client interactions are important qualities for this role.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
chandigarh
On-site
As a Senior Data Engineer, you will play a crucial role in supporting the Global BI team for Isolation Valves as they transition to Microsoft Fabric. Your primary responsibilities will involve data gathering, modeling, integration, and database design to facilitate efficient data management. You will be tasked with developing and optimizing scalable data models to cater to analytical and reporting needs, utilizing Microsoft Fabric and Azure technologies for high-performance data processing.

Your duties will include collaborating with cross-functional teams such as data analysts, data scientists, and business collaborators to comprehend their data requirements and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to back Power BI and automation initiatives. Additionally, your expertise in data modeling, particularly in data warehouse and lakehouse design, will be essential in designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.

Furthermore, you will be responsible for developing ETL processes using tools like SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar platforms to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also fall under your purview. You will supervise and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads.

Your role will require strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms, along with experience in data integration and ETL tools like Azure Data Factory.
A deep understanding of Microsoft Fabric or similar data platforms, as well as comprehensive knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions, will be necessary. Effective communication skills to convey technical concepts to both technical and non-technical stakeholders, the ability to work both independently and within a team environment, and the willingness to stay abreast of new technologies and business areas are also vital for success in this role.

To excel in this position, you should possess 5-7 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities to tackle complex data challenges, and proficiency in database management, SQL query optimization, and data mapping. A solid grasp of Excel, including formulas, filters, macros, pivots, and related operations, is essential. Proficiency in Python and SQL/advanced SQL for data transformations and debugging, along with a willingness to work flexible hours based on project requirements, is also required.

Furthermore, hands-on experience with Fabric components such as Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI Integration, and Semantic Models, as well as advanced SQL skills and experience with complex queries, data modeling, and performance tuning, are highly desired. Prior exposure to implementing Medallion Architecture for data processing, experience in a manufacturing environment, and familiarity with Oracle, SAP, or other ERP systems will be advantageous. A Bachelor's degree or equivalent experience in a Science-related field, with good interpersonal skills in English (spoken and written) and Agile certification, will set you apart as a strong candidate for this role.

At Emerson, we are committed to fostering a workplace where every employee is valued, respected, and empowered to grow.
Our culture encourages innovation, collaboration, and diverse perspectives, recognizing that great ideas stem from great teams. We invest in your ongoing career development, offering mentorship, training, and leadership opportunities to ensure your success and make a lasting impact. Employee wellbeing is a priority for us, and we provide competitive benefits plans, medical insurance options, an Employee Assistance Program, flexible time off, and other supportive resources to help you thrive.

Emerson is a global leader in automation technology and software, dedicated to helping customers in critical industries operate more sustainably and efficiently. Our commitment to our people, communities, and the planet drives us to create positive impacts through innovation, collaboration, and diversity. If you seek an environment where you can contribute to meaningful work, develop your skills, and make a difference, join us at Emerson. Let's go together towards a brighter future.
Posted 1 month ago
15.0 - 19.0 years
0 Lacs
karnataka
On-site
As a Senior Business Intelligence Expert, you will leverage your extensive experience in PowerBI development to create high-performance and visually compelling business intelligence solutions. Your expertise in semantic modeling, data pipeline development, and API integration will play a crucial role in transforming complex data into actionable insights through intuitive dashboards that adhere to consistent branding guidelines and utilize advanced visualizations.

You will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive key business decisions throughout the organization. Your proficiency in data modeling, ETL processes, and visualization best practices will be essential in delivering top-notch BI assets that meet performance standards and offer exceptional user experiences.

Key Responsibilities:
- Lead optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities for data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Required Qualifications:
- 15+ years of experience in business intelligence, data analytics, or a related field.
- Good experience in Databricks.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficiency in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an exciting opportunity to work with cutting-edge business intelligence technologies and deliver impactful solutions that drive organizational success through data-driven insights.
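As a hedged sketch of the API-ingestion responsibility listed above: the snippet below parses a JSON payload and flattens it into rows ready for loading into a dataset. The payload is a hard-coded stand-in for an HTTP response body and the field names are invented; a real integration would call the actual service, handle authentication, and paginate.

```python
import json

# Stand-in for the body of an HTTP API response; field names are hypothetical.
payload = """
{"results": [
    {"metric": "sales", "value": 120, "ts": "2024-01-01"},
    {"metric": "sales", "value": 135, "ts": "2024-01-02"}
]}
"""

def to_rows(raw):
    """Flatten the response into (timestamp, metric, value) rows for loading."""
    doc = json.loads(raw)
    return [(item["ts"], item["metric"], item["value"]) for item in doc["results"]]

rows = to_rows(payload)
```

Downstream, rows in this shape would be appended to a staging table that a PowerBI semantic model or Databricks pipeline consumes.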
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
About Us: JP Morgan is a dynamic and innovative company dedicated to providing top-tier solutions to our clients. We pride ourselves on our collaborative culture, cutting-edge technology, and commitment to excellence. We are currently seeking a highly skilled and motivated Solutions Analyst to join our team. As a Solutions Analyst, you will play a pivotal role in designing, developing, and implementing data-driven solutions that meet the needs of our clients. You will leverage your expertise in dashboard development and SQL to create insightful and actionable business intelligence tools. This role requires a strategic thinker with strong analytical skills and the ability to lead projects from conception to completion. Key Responsibilities: Lead the design, development, and deployment of dashboards and reports to provide actionable insights to stakeholders. Understand the end-to-end process and rate of delivery for overall solutions. Ensure current priorities are in line with the expectations of key stakeholders. Develop and optimize SQL queries to extract, transform, and load data from various sources. Ensure data accuracy, integrity, and security across all data/reporting solutions. Conduct thorough testing and validation of dashboards and reports to ensure high-quality deliverables. Collaborate with business users to gather requirements and translate them into technical specifications. Stay current with industry trends and best practices in business intelligence and data analytics. Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field. Minimum of 3 years of experience in dashboard development, preferably with Qlik Sense. Proficiency in SQL and experience with database management systems and writing complex queries. Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Ability to work independently and as part of a team. Experience with data visualization tools and techniques.
Knowledge of data warehousing concepts and ETL processes is a plus. Preferred knowledge of ITIL-based practices and associated data (Incident, Change, Problem, Resiliency, Capacity Management). Ability to manage multiple in-flight responsibilities and a high volume of detailed work effectively. Preferred UX experience and Qlik Mashup (or equivalent) experience. Project management experience is a plus. Experience in any scripting language (Python, JavaScript) or front-end development is a plus.
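The SQL portion of the role, writing aggregation queries that feed dashboard tiles, might look like the sketch below. An in-memory SQLite database stands in for the real engine, and the incidents table and its columns are invented for illustration (echoing the ITIL incident data mentioned in the posting).

```python
import sqlite3

# In-memory SQLite stands in for the production database engine;
# the incidents table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE incidents (id INTEGER, priority TEXT, opened_month TEXT)")
conn.executemany(
    "INSERT INTO incidents VALUES (?, ?, ?)",
    [(1, "P1", "2024-01"), (2, "P2", "2024-01"), (3, "P1", "2024-02")],
)

# The kind of aggregation a dashboard tile might be built on:
# incident volume per month, split by priority.
rows = conn.execute(
    """
    SELECT opened_month, priority, COUNT(*) AS n
    FROM incidents
    GROUP BY opened_month, priority
    ORDER BY opened_month, priority
    """
).fetchall()
```

A Qlik Sense load script or dashboard would typically consume the result of a query shaped like this rather than raw row-level data.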
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a QlikView Administrator at Strawberry InfoTech in Gurugram, you will be responsible for installing, configuring, and maintaining QlikView servers, developing and supporting QlikView applications, troubleshooting issues, and ensuring system performance and availability. You will provide 1st and 2nd level support for Qlik applications, including QlikView, Qlik Sense, and Qlik NPrinting. Your responsibilities will include troubleshooting and resolving issues related to data loads, application performance, and user access, as well as monitoring application performance, data load processes, and system health. You will assist end-users with navigation, report generation, and application functionalities, track and document issues, resolutions, and changes, and escalate complex issues to higher levels as needed. Additionally, you will ensure data accuracy and integrity by validating data loads and performing necessary checks, collaborate with developers, data engineers, and business analysts to implement fixes and enhancements, support the implementation of system upgrades, patches, and enhancements, maintain up-to-date support and process documentation, user guides, and best practices, and provide training and guidance to end-users on how to effectively use Qlik applications. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of experience as a QlikView Administrator. You should possess proficiency in QlikView, Qlik Sense, and Qlik NPrinting, a good understanding of data modeling and ETL processes in Qlik, familiarity with SQL and data visualization concepts, and experience in troubleshooting application and data-related issues. Strong analytical and problem-solving skills, excellent communication and interpersonal skills, and the ability to work independently and in a team environment are essential for this role. 
Qlik certifications (e.g., QlikView Developer, Qlik Sense Data Architect) are considered a plus. If you are interested in this opportunity, please share your updated resume at deepak.k@strawberryinfotech.com.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
You will play a crucial role as a Data Engineer, leading the development of data infrastructure. Your responsibilities will involve creating and maintaining systems that ensure a seamless flow, availability, and reliability of data. Your key tasks at Coforge will include:
- Developing and managing data pipelines to facilitate efficient data extraction, transformation, and loading (ETL) processes.
- Designing and enhancing data storage solutions such as data warehouses and data lakes.
- Ensuring data quality and integrity by implementing data validation, cleansing, and error handling mechanisms.
- Collaborating with data analysts, data architects, and software engineers to understand data requirements and provide relevant data sets for business intelligence purposes.
- Automating and enhancing data processes and workflows to drive scalability and efficiency.
- Staying updated on industry trends and emerging technologies in the field of data engineering.
- Documenting data pipelines, processes, and best practices to facilitate knowledge sharing.
- Contributing to data governance and compliance initiatives to adhere to regulatory standards.
- Working closely with cross-functional teams to promote data-driven decision-making across the organization.

Key skills required for this role:
- Proficiency in data modeling and database management.
- Strong programming capabilities, particularly in Python, SQL, and PL/SQL.
- Sound knowledge of Airflow, Snowflake, and DBT.
- Hands-on experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and cloud platforms, especially Azure.

Your 5-10 years of experience will be instrumental in successfully fulfilling the responsibilities of this role, located in Greater Noida with a shift timing of 2:00 PM IST to 10:30 PM IST.
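One of the listed tasks, implementing data validation, cleansing, and error handling in a pipeline, can be illustrated with a small dependency-free Python sketch; the field names and validation rules here are hypothetical.

```python
def clean_rows(rows):
    """Validate and cleanse raw rows, routing bad records to an
    error list instead of failing the whole load.

    The fields (id, amount) and the rules are illustrative only.
    """
    good, errors = [], []
    for row in rows:
        try:
            amount = float(row["amount"])
            if amount < 0:
                raise ValueError("negative amount")
            good.append({"id": int(row["id"]), "amount": round(amount, 2)})
        except (KeyError, TypeError, ValueError) as exc:
            # Quarantine the record with a reason rather than aborting.
            errors.append({"row": row, "reason": str(exc)})
    return good, errors

good, errors = clean_rows([
    {"id": "1", "amount": "10.456"},
    {"id": "2", "amount": "oops"},  # fails float() and is quarantined
    {"id": "3"},                    # missing field, also quarantined
])
```

In an Airflow/DBT stack, a step like this would usually sit in a transform task, with the quarantined records written to an error table for later review.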
Posted 1 month ago
4.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
As a skilled professional in SQL development and ETL processes, you will be responsible for designing and implementing ETL processes to extract, transform, and load data from diverse sources into data warehouses. Your role will involve developing and optimizing SQL queries for efficient data retrieval and reporting. Collaboration with data analysts and business stakeholders is essential to understand data requirements and provide effective solutions. In this position, you will play a key role in monitoring and troubleshooting ETL processes to ensure data integrity and optimal performance. Your tasks will include creating and maintaining documentation for data processes and workflows. Proficiency in SQL and experience with ETL tools are crucial for success in this role, along with a solid background in data warehousing concepts and practices. Utilizing tools such as Power BI to create interactive dashboards and reports that visualize data will be part of your responsibilities. The ideal candidate for this position should have 4-12 years of experience in SQL development and ETL processes, familiarity with Power BI or similar data visualization tools, and strong analytical and problem-solving skills. If you are looking for a challenging opportunity where you can leverage your expertise in SQL, ETL processes, and data visualization tools to drive impactful business outcomes, this role might be the perfect fit for you.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior ETL Testing Automation professional with 7 to 10 years of experience, you will be responsible for designing, developing, and executing ETL testing automation scripts. You will collaborate with cross-functional teams to ensure quality assurance of ETL processes. Your role will involve identifying and reporting defects, troubleshooting issues, and ensuring timely resolution. Additionally, you will be required to conduct performance testing and optimize ETL workflows for efficiency and accuracy. To excel in this role, you should possess 7-10 years of experience in ETL testing and automation, along with proficiency in SQL for data querying and manipulation. A strong knowledge of ETL processes and tools, as well as experience in automation testing and ETL tools, is essential. You should have the ability to work independently and in a team environment, demonstrating excellent problem-solving and communication skills. Join Coders Brain, a global leader in IT services and digital solutions, as a Senior ETL Testing Automation professional based in Chennai, Tamil Nadu, India. This is a full-time employment opportunity that offers the chance to work with a dynamic team.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Analytics professional at Monotype Solutions India, you will play a crucial role in aggregating data from various sources to develop visually appealing, automated dashboards that support CX leaders in enhancing retention and upsell goals. Your responsibilities will involve analyzing large datasets to extract meaningful insights, patterns, and trends, with a strong emphasis on scalability and automation. Additionally, you will collaborate with various teams to ensure consistent data hygiene, logic, and governance across the organization. Your role will also entail developing dynamic and interactive visualizations to simplify complex information for diverse audiences. You will engage closely with QA on data validation and explore opportunities to incorporate AI and automation to enhance data accuracy and reduce manual interventions. Furthermore, you will drive process automation initiatives to improve data accuracy, minimize redundancies, and enhance reporting efficiency. To excel in this position, you should hold a Bachelor's degree in a relevant field with at least 7 years of data analytics experience, preferably in a SaaS organization emphasizing automation and process improvement. Proficiency in business intelligence tools like Power BI, Tableau, or Sigma is essential, along with the ability to integrate Salesforce CRM with analytics workflows. Strong stakeholder management skills and the capability to translate business needs into scalable, automated solutions are key requirements. Your technical acumen, including an understanding of ETL processes, data engineering best practices, and automation tools, will be critical in organizing large datasets into structured, scalable models for deeper insights and efficient decision-making. Your problem-solving skills and passion for designing scalable, automated solutions will be instrumental in enhancing business efficiency. 
At Monotype, you can look forward to hybrid work arrangements, competitive compensation, comprehensive medical insurance coverage, and various development and advancement opportunities. You will be part of a creative, innovative, and global working environment in the creative and software technology industry, with access to a highly engaged Events Committee and Reward & Recognition Programs. Additionally, proficiency in languages such as German, Japanese, French, or Spanish is desirable for this role, reflecting Monotype's global expansion. Join Monotype Solutions India and be a part of a dynamic and collaborative team that is dedicated to leveraging data analytics, automation, and process optimization to drive business success.
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
pune, maharashtra
On-site
As a Database Architect at Orion Innovation, a premier global business and technology services firm, you will be responsible for designing, implementing, and maintaining database systems with a focus on DB2, SQL Server, Azure SQL, Cosmos DB, and other cloud databases. Your role will involve collaborating closely with development and IT teams to ensure that databases are scalable, secure, and efficient. Key Responsibilities: - Design and implement database solutions based on business requirements. - Develop and maintain data models, database schemas, and data dictionaries. - Optimize database performance through indexing, query optimization, and other techniques. - Ensure data integrity, security, and availability. - Collaborate with development teams to integrate databases with applications. - Monitor and troubleshoot database issues, providing timely resolutions. - Stay updated with the latest database technologies and best practices. - Mentor and guide junior database administrators and developers. Specific Responsibilities: - Design and manage databases using DB2, SQL Server, Azure SQL, and other cloud database platforms. - Implement and manage cloud-based database solutions, ensuring high availability and disaster recovery. - Develop and maintain ETL processes using ADF for data integration and migration. - Implement database security measures, including encryption and access controls. - Perform database tuning and optimization for cloud environments. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - Proven experience as a Database Architect or similar role. - Strong knowledge of DB2, SQL Server, Azure SQL, and other cloud database platforms. - Proficiency in data modeling, database design, and performance tuning. - Experience with cloud database solutions and services. - Excellent problem-solving and analytical skills. - Strong communication and collaboration abilities. 
- Certification in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) is a plus.

Orion Systems Integrators, LLC and its affiliates are committed to protecting your privacy during the application and recruitment process. For more details on our Candidate Privacy Policy, please refer to the full policy on our website.
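Indexing-based performance tuning, one of the key responsibilities above, can be demonstrated in miniature with SQLite standing in for DB2 or SQL Server; the table and index names are illustrative only.

```python
import sqlite3

# SQLite stands in for DB2/SQL Server/Azure SQL here; the orders
# table and idx_orders_customer index are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index the planner falls back to a full table scan...
before = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()

# ...while a covering index on (customer_id, total) lets it seek
# directly and answer the query from the index alone.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()
```

Each engine exposes its own plan inspection tool (SQLite's EXPLAIN QUERY PLAN, SQL Server's execution plans, DB2's EXPLAIN), but the tuning loop is the same: inspect the plan, add or adjust an index, and confirm the scan became a seek.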
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
At TELUS Digital, you will play a crucial role in enabling customer experience innovation by fostering spirited teamwork, embracing agile thinking, and embodying a caring culture that prioritizes customers. As the global arm of TELUS Corporation, a leading telecommunications service provider in Canada, we specialize in delivering contact center and business process outsourcing solutions to major corporations across various sectors such as consumer electronics, finance, telecommunications, and utilities. With our extensive global call center capabilities, we offer secure infrastructure, competitive pricing, skilled resources, and exceptional customer service, all supported by TELUS, our multi-billion dollar parent company. In this role, you will leverage your expertise in Data Engineering, backed by a minimum of 4 years of industry experience, to drive the success of our projects. Proficiency in Google Cloud Platform (GCP) services including Dataflow, BigQuery, Cloud Storage, and Pub/Sub is essential for effectively managing data pipelines and ETL processes. Your strong command over the Python programming language will be instrumental in performing data processing tasks efficiently. You will be responsible for optimizing data pipeline architectures, enhancing performance, and ensuring reliability through your software engineering skills. Your ability to troubleshoot and resolve complex pipeline issues, automate repetitive tasks, and monitor data pipelines for efficiency and reliability will be critical in maintaining operational excellence. Additionally, your familiarity with SQL, relational databases, and version control systems like Git will be beneficial in streamlining data management processes. As part of the team, you will collaborate closely with stakeholders to analyze, test, and enhance the reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs. 
Your commitment to continuous improvement, SLA adherence, and post-incident reviews will drive the evolution of our data pipeline systems. Excellent communication, problem-solving, and analytical skills are essential for effectively documenting processes, providing insights, and ensuring seamless operations. This role offers a dynamic environment where you will have the opportunity to work in a 24x7 shift, contributing to the success of our global operations and making a meaningful impact on customer experience.
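Troubleshooting pipeline issues and automating repetitive tasks, as described in this role, often starts with hardening job steps against transient failures. Below is a minimal standard-library sketch of such a retry wrapper; it is illustrative, not a TELUS-specific implementation.

```python
import logging
import time

def run_with_retry(task, attempts=3, delay=0.01):
    """Run a pipeline task, retrying transient failures and logging
    each attempt: the kind of hardening used to keep ETL jobs
    inside their SLAs."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            logging.warning("attempt %d/%d failed", attempt, attempts)
            if attempt == attempts:
                raise  # exhausted retries: surface the failure for triage
            time.sleep(delay)

# A hypothetical extract step that fails twice before succeeding,
# mimicking a flaky upstream source.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return ["row-1", "row-2"]

rows = run_with_retry(flaky_extract)
```

In practice the same idea is usually delegated to the orchestrator (Dataflow, Control-M, or Informatica retry policies); a wrapper like this is a fallback for steps those tools do not cover.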
Posted 1 month ago
10.0 - 12.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Key Job Responsibilities: VOC - VI (Vulnerability Intelligence), ASM (Attack Surface Management) & VM (Vulnerability Management) Expert.

Environment / Context
Saint-Gobain, world leader in the habitat and construction market, is one of the top 100 global industrial groups. Saint-Gobain is present in 68 countries with 171,000 employees. They design, manufacture and distribute materials and solutions which are key ingredients in the wellbeing of each of us and the future of all. They can be found everywhere in our living places and our daily life: in buildings, transportation, infrastructure and in many industrial applications. They provide comfort, performance and safety while addressing the challenges of sustainable construction, resource efficiency and climate change.
Saint-Gobain GDI Group (250 persons at the head office, including 120 that are internal) is responsible for defining, setting up and managing the Group's Information Systems (IS) and Telecom policy with its 1,000 subsidiaries in 6,500 sites worldwide. The GDI Group also carries the common means (infrastructures, telecoms, digital platforms, cross-functional applications). INDEC, the IT Development Centre of Saint-Gobain, is an entity with a vision to leverage India's technical skills in the Information Technology domain to provide timely, high-quality and cost-effective IT solutions to Saint-Gobain businesses globally. Within the Cybersecurity Department, the Cybersecurity Vulnerability Operations Center's mission is to identify, assess and confirm vulnerabilities and threats that can affect the Group. The CyberVOC teams are based out of Paris and Mumbai and consist of skilled persons working in different Service Lines.

Mission
We are seeking a highly experienced cybersecurity professional to serve as a VOC Expert supporting the Vulnerability Intelligence (VI), Attack Surface Management (ASM), and Vulnerability Management (VM) teams. This role is pivotal in shaping the strategy, defining technical approaches, and supporting day-to-day operations, particularly complex escalations and automation efforts. The ideal candidate will combine technical mastery in offensive security with practical experience in vulnerability lifecycle management and external attack surface discovery. The expert will act as a senior advisor and technical authority for the analyst teams, while also contributing to the design, scripting, and documentation of scalable security processes.

The VOC Expert is responsible for:

Vulnerability Intelligence (VI):
- Drive the qualification and risk analysis of newly disclosed vulnerabilities.
- Perform exploit PoC validation when needed to assess practical risk.
- Maintain and enhance the central VI database, enriched with EPSS, CVSS, QVS, SG-specific scoring models, and EUVD data.
- Define and automate workflows for vulnerability qualification, exposure analysis, and prioritization, and for ingestion of qualified vulnerability data into the enterprise Data Lake.
- Collaborate on documentation of the VI methodology and threat intelligence integration.
- Support proactive communication of high/critical vulnerabilities to asset and application owners.

Attack Surface Management (ASM):
- Operate and enhance external asset discovery and continuous monitoring using ASM tools.
- Integrate asset coverage data from the CMDB and other internal datasets.
- Design and implement scripts for WHOIS/ASN/banner correlation and for data enrichment and alert filtering.
- Deploy and maintain custom scanning capabilities (e.g., Nuclei integrations).
- Provide expert input on threat modeling based on exposed assets and external footprint.

BlackBox Pentesting:
- Maintain the service delivery of the BlackBox Pentesting platform.
- Automate the export of pentest data and integrate it into the Data Lake and Power BI dashboards.
- Define and document onboarding workflows for new applications.
- Actively guide analysts in prioritizing pentest requests and validating results.

Vulnerability Management (VM):
- Vulnerability review, recategorization, and false positive identification.
- Proactive vulnerability testing and replay.
- Pre-analyze and consolidate vulnerability data from various scanning tools.
- Prepare concise syntheses of available vulnerabilities.
- Offer guidance to the SO and CISO on vulnerabilities.
- Collaborate with key stakeholders to develop strategies for vulnerability management.
- Assist in defining vulnerability management KPIs and strategic goals.
- Prepare concise, actionable summaries for high-risk vulnerabilities and trends.

Automate testing actions:
- Develop scripts and tooling to automate repetitive and complex tasks across VI, ASM and VM.
- Implement data pipelines to sync outputs from ASM/VI tools to dashboards and reporting engines.
- Design streamlined workflows for the vulnerability lifecycle, from detection to closure.
- Collaborate with both offensive and defensive teams to support App managers and Asset managers in remediating vulnerabilities and issues.

Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Security, EXTC or a related field; relevant certifications (e.g., CISSP, CCSP, CompTIA Security+) are a plus.
- Proven experience (10+ years) working within the cybersecurity field, with a focus on offensive security, vulnerability intelligence and attack surface analysis.
- Proven experience with penetration testing (web application, infrastructure, etc.).
- Proven expertise in CVE analysis, exploit development/validation, external asset discovery and mapping, and threat modeling and prioritization.
- Advanced knowledge of tooling such as ASM platforms, Nuclei, Shodan, open-source CTI, vulnerability scanners (e.g., Qualys, Tenable) and pentester tools (Burp, SQLmap, Responder, IDA and the Kali environment).
- Experience in investigating newly published vulnerabilities and assessing their risks and severity.
- Strong scripting skills (e.g., Python, Bash, PowerShell, C#) for automation and customization.
- Strong technical skills with an interest in open-source intelligence investigations.
- Experience building dashboards in Power BI or similar tools.
- Familiarity with data lakes, API integrations, and ETL processes.
- Knowledge of the NIST CVE database, OWASP Top 10, and Microsoft security bulletins.
- Excellent writing skills in English and the ability to communicate complicated technical challenges in business language to a range of stakeholders.

Personal Skills:
- A systematic, disciplined, and analytical approach to problem solving.
- Thorough leadership skills and experience.
- Excellent ability to think critically under pressure.
- Strong communication skills to convey technical concepts clearly to both technical and non-technical stakeholders.
- Willingness to stay updated with evolving cyber threats, technologies, and industry trends.
- Capacity to work collaboratively with cross-functional teams, developers, and management to implement robust security measures.

Additional Information: The position is based in Mumbai (India).
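As one illustration of the scoring and prioritization work described above, a triage score might blend CVSS severity, EPSS exploit likelihood, and exposure. The weights below are invented for the example; the SG-specific scoring model referenced in the posting is proprietary and not reproduced here.

```python
def priority_score(cvss: float, epss: float, internet_facing: bool) -> float:
    """Blend severity (CVSS 0-10), exploit likelihood (EPSS 0-1) and
    exposure into one 0-100 triage score. Weights are illustrative."""
    score = (cvss / 10) * 60 + epss * 40
    if internet_facing:
        # Externally exposed assets get a flat uplift, capped at 100.
        score *= 1.2
    return round(min(score, 100.0), 1)

# Hypothetical findings, as might come from scanner + EPSS enrichment.
findings = [
    {"cve": "CVE-2024-0001", "cvss": 9.8, "epss": 0.92, "internet_facing": True},
    {"cve": "CVE-2024-0002", "cvss": 7.5, "epss": 0.03, "internet_facing": False},
]
ranked = sorted(
    findings,
    key=lambda f: priority_score(f["cvss"], f["epss"], f["internet_facing"]),
    reverse=True,
)
```

A real VI workflow would pull CVSS from the NVD, EPSS from the FIRST feed, and exposure flags from ASM/CMDB data, then push the ranked list into the Data Lake and dashboards.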
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a BI and Reporting Specialist at Hogarth, you will be responsible for delivering a suite of standardised and ad hoc reports through our BI and reporting platform while utilizing best practices for data visualisation. Your expertise will be crucial in translating business needs into design, developing tools, techniques, metrics, and dashboards for insights and data visualization. You will play a key role in developing efficient data models for reporting and analytics using a modern data stack, identifying data quality issues, and suggesting improvements related to data governance and best practices. In this role, you will support the modelling of data to facilitate medium to long-term decision-making for the team and leadership community. It will be your responsibility to validate the presence and accuracy of data before its use within a report, maintain technical documentation, and provide data management support to users. Additionally, you will assist in maintaining accurate reporting delivery plans, timelines, and provide feedback to the delivery team. To excel in this position, you must have previous experience in a similar role, ideally working in a Data team of a large company, preferably with a background in the Creative or Media environment. Proficiency in various data systems, ETL processes, multi-channel analytics tools and platforms, as well as strong use of BI tools like MS Power BI, MS Power Automate, Thoughtspot, and Tableau is essential. A solid understanding of SQL, data warehouse structure principles, database modelling, and experience working with agile methodologies such as Scrum/Kanban is required. Your role will also involve collaborating as a team player with excellent interpersonal skills in an intercultural working environment. 
Excellent communication skills, the ability to present technical matters in a comprehensible way to non-technical stakeholders, a problem-solving attitude, and fluency in Business English are necessary for success in this position. While not mandatory, experience in JavaScript, knowledge of media performance metrics and reporting solutions, experience gathering and analyzing system requirements, and familiarity with Jira and other project/task management tools would be advantageous. At WPP, we are committed to fostering a culture of respect and inclusivity, providing equal opportunities for all applicants without discrimination. Join us in using the power of creativity to build better futures for our people, planet, clients, and communities.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The Data Visualization, Business Analytics role is vital for the organization as it involves transforming intricate data into visual insights for key stakeholders, facilitating informed decision-making and strategic planning. You will collaborate with business leaders to recognize and prioritize data visualization requirements. Your responsibilities will include designing interactive dashboards and reports to illustrate essential business metrics and trends. You will create visually appealing charts, graphs, and presentations that are easily understandable. Furthermore, it is essential to develop and uphold data visualization best practices and standards. As part of the role, you will utilize various data visualization tools and platforms to present insights effectively. Conducting data analysis to identify patterns and trends for visualization purposes will be a key task. Implementing user interface (UI) and user experience (UX) principles to enhance visualization is crucial. Providing training and support to team members on data visualization techniques is also part of the responsibilities. Additionally, you will be responsible for performing ad-hoc analysis and data mining to support business needs. Collaboration with data engineers and data scientists to ensure data accuracy and integrity is essential. It is important to stay updated with industry trends and best practices in data visualization and business analytics. Presenting findings and insights to key stakeholders in a clear and compelling manner will be a regular task. Communication with cross-functional teams to understand data requirements is vital. You will contribute to the continuous improvement of data visualization processes and techniques. The role requires a Bachelor's degree in Data Science, Business Analytics, Computer Science, or a related field. Proven experience in data visualization, business intelligence, or related roles is necessary. 
Proficiency in data visualization tools like Tableau, Power BI, or D3.js is essential. Strong analytical and problem-solving skills are required. Expertise in SQL for data querying and manipulation is a must. An understanding of statistical concepts and data modeling is crucial. Excellent communication and presentation skills are necessary. The ability to work effectively in a fast-paced and dynamic environment is essential. Knowledge of business operations and strategic planning is required. Experience in interpreting and analyzing complex datasets is beneficial. Familiarity with data warehousing and ETL processes is a plus. You will also need to manage multiple projects and deadlines simultaneously, be detail-oriented with a focus on data accuracy and quality, work collaboratively in a team setting, and bring strong business acumen and an understanding of key performance indicators.
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
hyderabad, telangana
On-site
A career within Financial Markets Business Advisory services will provide you with the opportunity to contribute to a variety of audit, regulatory, valuation, and financial analyses services to design solutions that address clients' complex accounting and financial reporting challenges, as well as their broader business issues. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. The PwC Professional, our global leadership development framework, gives us a single set of expectations across our lines, geographies, and career paths. It provides transparency on the skills required as individuals to be successful and progress in our careers, now and in the future.

Responsibilities
As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Invite and give in-the-moment feedback in a constructive manner.
- Share and collaborate effectively with others.
- Identify and make suggestions for improvements when problems and/or opportunities arise.
- Handle, manipulate, and analyze data and information responsibly.
- Follow risk management and compliance procedures.
- Keep up to date with developments in your area of specialism.
- Communicate confidently in a clear, concise, and articulate manner - verbally and in the materials produced.
- Build and maintain an internal and external network.
- Seek opportunities to learn about how PwC works as a global network of firms.
- Uphold the firm's code of ethics and business conduct.

We are seeking a highly motivated Data Engineer - Associate to join our dynamic team.
The ideal candidate will have a strong foundation in data engineering, particularly with Python and SQL, and exposure to cloud technologies and data visualization tools such as Power BI, Tableau, or QuickSight. The Data Engineer will work closely with data architects and business stakeholders to support the design and implementation of data pipelines and analytics solutions. This role offers an opportunity to grow technical expertise in cloud and data solutions, contributing to projects that drive business insights and innovation. Key Responsibilities Data Engineering: - Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources. - Use Python and SQL to design and implement scalable data processing solutions. - Ensure data quality and consistency throughout data transformation and storage processes. - Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements. Cloud Technologies - Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions. - Support the migration of on-premise data infrastructure to the cloud environment when needed. - Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses. Data Visualization - Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight. - Collaborate with analysts to understand their data needs and optimize data structures for reporting. Collaboration And Support - Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making. - Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery. - Document processes, data flows, and infrastructure for team knowledge sharing. Required Skills And Experience - 0+ years of experience in data engineering, working with Python and SQL. 
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.

Qualifications
- Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
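For candidates gauging the level expected here, the Python-and-SQL pipeline work described above can be illustrated with a minimal extract-transform-load sketch: deduplicate and quality-check records with pandas, load them into a SQL store, and aggregate with plain SQL. The table and column names are invented for the example, and an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

import pandas as pd

# Extract: in a real pipeline this would come from files, APIs, or source systems.
raw = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "amount": [120.0, 85.5, 85.5, None],
    "region": ["south", "north", "north", "south"],
})

# Transform: drop duplicate records and rows failing a simple quality rule
# (a non-null amount), the kind of consistency check the role calls for.
clean = raw.drop_duplicates(subset="order_id").dropna(subset=["amount"])

# Load: write to a SQL store, then aggregate with plain SQL for reporting.
conn = sqlite3.connect(":memory:")
clean.to_sql("orders", conn, index=False)
totals = pd.read_sql(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region", conn
)
print(totals)
```

In practice the same shape scales up: the extract step reads from cloud storage or source databases, and the load step targets a warehouse such as BigQuery or Redshift rather than SQLite.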
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
The AVP, Business and Customer Analytics (L10) at Synchrony plays a crucial role in delivering high-impact projects by collaborating with various analytics teams to solve key business problems using data-driven solutions. As a part of the India Analytics Hub, you will work on projects that enable the company's growth and profitability through advanced analytics techniques.

Your responsibilities will include supporting American Eagle business stakeholders with data-driven insights, leading analytics projects from inception to delivery, deriving actionable recommendations from data insights, and ensuring project timelines and accuracy are met. You will also contribute to internal initiatives, handle multiple projects, and demonstrate strong project management skills.

The ideal candidate should hold a degree in Statistics, Mathematics, Economics, Engineering, or a related quantitative field with at least 4 years of hands-on experience in analytics or data science. Proficiency in SQL/SAS programming, Business Intelligence tools like Tableau & Power BI, Google Cloud Platform, ETL processes, and Big Data analytics is required. Experience in campaign sizing, customer targeting, and the financial services industry will be beneficial. Desired skills include working with Python/R, big data technologies like Hadoop/Hive/GCP, and report automation. Effective communication skills, the ability to lead strategic projects independently, and the ability to manage competing priorities are essential for this role.

The role offers Enhanced Flexibility and Choice in work timings, requiring availability between 06:00 AM - 11:30 AM Eastern Time for meetings with global teams. Internal applicants should ensure they meet the eligibility criteria, inform their manager, update their professional profile, and upload an updated resume in the application process.
If you are a motivated individual with a passion for analytics and a desire to drive business growth through data-driven solutions, this role provides an exciting opportunity to make a significant impact within the organization.
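As a rough illustration of the campaign-sizing and customer-targeting work this posting mentions, the sketch below counts the customers who meet hypothetical eligibility criteria. The column names, thresholds, and data are all invented for the example; a real sizing exercise would run against warehouse tables via SQL or SAS.

```python
import pandas as pd

# Hypothetical customer base with trailing-12-month spend and activity flags.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104, 105],
    "spend_12m": [5200, 300, 1800, 75, 2400],
    "active": [True, True, False, True, True],
})

# Campaign sizing: how many customers satisfy the targeting criteria
# (active accounts with spend above an assumed cutoff of 1000)?
target = customers[customers["active"] & (customers["spend_12m"] > 1000)]
print(f"Eligible audience: {len(target)} of {len(customers)} customers")
```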
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Analytics Manager specializing in Power BI, Python, and Tableau within the Insurance Domain, you will play a crucial role in designing, developing, and implementing Power BI dashboards. Your expertise in Power BI, Python, Tableau, and SQL is essential for this role. You will be responsible for leading, mentoring, and developing a team of data analysts and data scientists.

Your key responsibilities will include providing strategic direction for analytical projects, defining and implementing the company's data analytics strategy, and conducting complex data analysis to identify trends and patterns. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights easily accessible to stakeholders. Ensuring data integrity and consistency across systems, collaborating with cross-functional teams, and staying current with the latest data analytics trends and technologies are also important aspects of this role. Additionally, you will lead data-driven projects from initiation to execution, managing timelines, resources, and risks effectively.

To be successful in this role, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field, with at least 10 years of experience in data analysis and 2 years in a managerial or leadership position. Proficiency in data analysis and visualization tools such as SQL, Python, R, Tableau, and Power BI is required. Strong knowledge of data modeling, ETL processes, and database management, along with exceptional problem-solving and critical thinking skills, are essential. Effective communication of complex technical concepts to non-technical stakeholders, proven experience in managing and growing a team of data professionals, strong project management skills, and domain knowledge in insurance will be advantageous for this role.
If you are looking for a challenging opportunity to lead data analytics projects, collaborate with diverse teams, and drive business insights within the Insurance Domain, this role is ideal for you.
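The trend-identification work described above can be pictured with a small Python sketch: aggregating insurance claims by month and computing month-over-month change, the kind of series that would then feed a Power BI or Tableau dashboard. The dataset and field names are made up for illustration.

```python
import pandas as pd

# Hypothetical claims records for a single line of business.
claims = pd.DataFrame({
    "claim_date": pd.to_datetime([
        "2024-01-05", "2024-01-20", "2024-02-11", "2024-02-25", "2024-03-02",
    ]),
    "paid_amount": [1500.0, 700.0, 2200.0, 400.0, 900.0],
})

# Aggregate paid amounts by calendar month ("MS" = month-start frequency),
# then compute month-over-month change to surface the trend.
monthly = claims.set_index("claim_date").resample("MS")["paid_amount"].sum()
mom_change = monthly.pct_change()
print(monthly)
print(mom_change)
```

A BI tool would typically consume the `monthly` series directly; computing the change in the pipeline keeps the dashboard layer simple.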
Posted 1 month ago