4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate-level role where you will be responsible for contributing to the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to assist in applications systems analysis and programming activities. You will utilize your knowledge of applications development procedures and concepts, along with basic knowledge of technical areas, to identify and define necessary system enhancements. This includes using script tools, analyzing code, and consulting with users, clients, and other technology groups to recommend programming solutions. Additionally, you will install and support customer exposure systems and apply fundamental knowledge of programming languages for design specifications.

As an Intermediate Programmer Analyst, you will analyze applications to identify vulnerabilities and security issues, conduct testing and debugging, and serve as an advisor or coach to new or lower-level analysts. You will be responsible for identifying problems, analyzing information, and making evaluative judgments to recommend and implement solutions. Operating with a limited level of direct supervision, you will exercise independence of judgment and autonomy while acting as a subject matter expert to senior stakeholders and/or other team members.

In this role, it is crucial to appropriately assess risk when making business decisions, with a focus on safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to policies, applying sound ethical judgment, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-6 years of proven experience in developing and managing Big Data solutions using Apache Spark and Scala is required
- Strong programming skills in Scala, Java, or Python
- Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, Flume, etc.
- Proficiency in SQL and experience with relational databases (Oracle/PL-SQL)
- Experience in working on Kafka, JMS/MQ applications
- Familiarity with data warehousing concepts and ETL processes
- Knowledge of data modeling, data architecture, and data integration techniques
- Experience with Java, Web services, XML, JavaScript, Microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Experience with developing frameworks and utility services, logging/monitoring, and high-quality software delivery
- Experience creating large-scale, multi-tiered, distributed applications with Hadoop and Spark
- Profound knowledge of implementing different data storage solutions such as RDBMS, Hive, HBase, Impala, and NoSQL databases

Education:
- Bachelor's degree or equivalent experience

This job description provides a high-level overview of the responsibilities and qualifications for the Applications Development Intermediate Programmer Analyst position. Other job-related duties may be assigned as required.
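For a concrete, if simplified, picture of the Spark work this posting describes, here is a minimal PySpark sketch of a batch aggregation over a Hive table; the role equally allows a Scala implementation, and the table and column names below are hypothetical:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch: read a Hive table, derive a daily aggregate, write it back.
spark = (SparkSession.builder
         .appName("daily-exposure-aggregate")
         .enableHiveSupport()
         .getOrCreate())

# 'trades' and 'daily_exposure' are hypothetical table names used for illustration.
trades = spark.table("trades")

daily = (trades
         .groupBy("account_id", F.to_date("trade_ts").alias("trade_date"))
         .agg(F.sum("notional").alias("total_notional"),
              F.count(F.lit(1)).alias("trade_count")))

daily.write.mode("overwrite").saveAsTable("daily_exposure")
spark.stop()

The same structure extends naturally to the Kafka and HBase integrations listed in the qualifications.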
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be working as a Principal IT Business Analyst at Arctera, a company that ensures the smooth functioning of IT systems worldwide. Arctera plays a crucial role in enabling various organizations, ranging from the largest corporations to the smallest businesses, to combat cyber threats, natural disasters, and regulatory challenges by leveraging data-driven solutions like Insight, InfoScale, and Backup Exec.

Your primary responsibility will involve collaborating with the EPM, Procurement, Fixed Assets, and Tax teams to analyze complex business processes and design comprehensive financial solutions using NetSuite ERP. You will lead discussions with stakeholders to define business requirements and translate them into technological solutions by aligning business processes with NetSuite functionalities.

Key Responsibilities:
- Analyze intricate business processes in collaboration with stakeholders and IT teams to identify requirements.
- Translate business needs into NetSuite solutions by designing functions and workflows, ensuring seamless integration with other applications.
- Lead the design and implementation of financial process flows within the NetSuite platform to optimize operations.
- Support revenue recognition processes and month-end close activities.
- Maintain data accuracy within the NetSuite system, oversee data imports/exports, and conduct regular audits to ensure data quality.
- Provide technical support and end-user training, and troubleshoot system-related issues to ensure operational efficiency.
- Develop and execute functional test plans, user guides, and other documentation to facilitate the use and maintenance of the NetSuite platform.

Required Skills, Experience & Education:
- Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred.
- 10+ years of experience as a business analyst/functional consultant in ERP implementation, with at least 5 years focused on NetSuite.
- Proficiency in NetSuite modules such as procurement, billing, accounting, tax, fixed assets, general ledger, revenue, and multi-currency management.
- Experience in Enterprise Performance Management (EPM) for financial planning, budgeting, reporting, and reconciliation across the organization.
- Hands-on experience in customizing and configuring NetSuite, including creating custom fields, reports, forms, and dashboards.
- Familiarity with NetSuite data migration tools, SuiteScript, SuiteFlow, and SuiteAnalytics.
- Knowledge of integration technologies, scalable integration solutions, and documentation of technical requirements.
- Experience in testing activities, including integration testing, end-to-end testing, and User Acceptance Testing (UAT).
- Effective communication, collaboration skills, and adaptability to a dynamic startup environment.
- Familiarity with cloud platforms and services; NetSuite certifications would be a plus.
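To make the data-audit side of the role concrete, here is a minimal sketch of the kind of recurring quality check it mentions; the exported file, columns, and rules are illustrative assumptions, not NetSuite specifics:

import pandas as pd

# Illustrative audit of an exported record list; file and column names are assumptions.
records = pd.read_csv("fixed_assets_export.csv")

issues = {
    "missing_asset_id": records["asset_id"].isna().sum(),
    "negative_cost": (records["original_cost"] < 0).sum(),
    "duplicate_asset_id": records["asset_id"].duplicated().sum(),
}

for rule, count in issues.items():
    print(f"{rule}: {count} record(s) flagged")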
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
Arctera is a company dedicated to ensuring the smooth functioning of IT systems worldwide. From enabling credit card transactions to maintaining power supply and supporting pharmaceutical production, Arctera plays a crucial role in the operations of both large and small organizations. Through innovative data solutions such as Insight, InfoScale, and Backup Exec, Arctera helps businesses navigate challenges like ransomware attacks, compliance issues, and natural disasters. In an era where data is constantly growing, Arctera is at the forefront of safeguarding critical information and promoting data privacy and sustainability. By leveraging cutting-edge technologies, Arctera aims to protect global infrastructure and secure data for all.

As a Principal IT Business Analyst specializing in NetSuite ERP for EPM, Procurement, Fixed Assets, and Tax, you will collaborate with various teams to analyze financial processes and design comprehensive solutions. Your responsibilities will include translating business requirements into NetSuite functionalities, implementing financial process flows, ensuring data integrity, and providing technical support to users.

Key responsibilities of the role include:
- Analyzing complex business processes and collaborating with stakeholders to identify requirements
- Designing and implementing NetSuite solutions to support business operations
- Leading financial process flows in the NetSuite platform and ensuring data accuracy
- Providing technical support, training, and troubleshooting assistance to end-users
- Documenting functional test plans, user guides, and other documentation to support NetSuite platform usage

To qualify for this role, you should have:
- A Bachelor's degree in computer science, engineering, or a related field (Master's degree preferred)
- 10+ years of experience as a business analyst or functional consultant in ERP implementation
- Strong expertise in NetSuite modules such as procurement, billing, accounting, revenue, tax, and fixed assets
- Experience in Enterprise Performance Management (EPM) and customization of NetSuite functionalities
- Proficiency in integration technologies, data mapping, and technical documentation
- Effective communication skills, adaptability to a dynamic environment, and familiarity with cloud platforms

Having NetSuite certifications would be an added advantage for this position. Join Arctera's team to be part of an innovative group dedicated to leveraging technology to protect critical infrastructure and ensure data security for organizations worldwide.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
As a Power BI Developer, you will be responsible for developing and maintaining scalable data pipelines using Python and PySpark. You will collaborate with data engineers and data scientists to fulfill data processing needs and optimize existing PySpark applications for performance improvements. Writing clean, efficient, and well-documented code following best practices is a crucial part of your role. Additionally, you will participate in design and code reviews, develop and implement ETL processes, and ensure data integrity and quality throughout the data lifecycle. Staying current with the latest industry trends and technologies in big data and cloud computing is essential.

The ideal candidate should have a minimum of 6 years of experience in designing and developing advanced Power BI reports and dashboards. Working experience with data modeling and DAX calculations is also required, including developing and maintaining data models, creating reports and dashboards, analyzing and visualizing data, ensuring data governance and compliance, and troubleshooting and optimizing Power BI solutions.

Preferred skills for this role include strong proficiency in Power BI Desktop, DAX, Power Query, and data modeling. Experience in analyzing data, creating visualizations, building interactive dashboards, connecting to various data sources, and transforming data is highly valued. Excellent communication and collaboration skills are necessary to work effectively with stakeholders. Familiarity with SQL, data warehousing concepts, and experience with UI/UX development would be beneficial.
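Because the role asks for optimizing existing PySpark applications, here is a minimal sketch of two common tuning steps, broadcasting a small dimension table and caching a reused result; the dataset paths and column names are hypothetical:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pyspark-tuning-sketch").getOrCreate()

# 'events' and 'dim_product' stand in for real datasets; paths are hypothetical.
events = spark.read.parquet("/data/events")
dim_product = spark.read.parquet("/data/dim_product")

# Broadcast the small dimension table to avoid a shuffle-heavy join,
# and cache the result because it is reused by several downstream reports.
enriched = events.join(F.broadcast(dim_product), "product_id").cache()

enriched.groupBy("category").count().show()
enriched.groupBy("region").agg(F.sum("amount").alias("total_amount")).show()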
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
You are not the person who settles for just any role. Neither are we. We are committed to creating Better Care for a Better World, which requires individuals and teams who care about making a difference. As the Data Governance Lead for Global People Organization Data, you will play a crucial role in executing the GPO data strategy by establishing data stewardship, data governance policies, data audits, and guardrails. Your responsibilities will include defining and assigning data roles and adopting and monitoring a data governance framework to ensure quality, security, and compliance.

In this role, you will develop data governance policies, such as policies covering data classification, access, data quality standards, and data sharing procedures. You will identify key stakeholders involved in data governance, establish a governance structure, and monitor and measure effectiveness by reviewing data governance performance metrics. Additionally, you will focus on data quality by implementing processes to ensure data accuracy, completeness, consistency, and timeliness, as well as data security, data compliance, and data lineage.

Furthermore, you will establish a data stewardship council and assign ownership of specific data sets to designated data stewards who are responsible for their quality and governance. You will work closely with various stakeholders across the organization to ensure compliance with relevant data privacy regulations and industry standards. Your role will involve leading with informal authority, building consensus, influencing decision-making, resolving conflicts, and achieving objectives through matrixed leadership.

To succeed in this position, you should have at least 5 years of experience leading data governance, including establishing frameworks and best practices, familiarity with data privacy regulations and compliance, and demonstrable data quality monitoring and improvement. You should possess strong communication skills and the ability to communicate effectively across multiple layers of the organization.

If you are passionate about driving innovation, growth, and impact, and want to be part of a company actively dedicated to sustainability, inclusion, wellbeing, and career development, Kimberly-Clark offers you an opportunity to be part of a team committed to making a difference. Your performance in this role will be pivotal in ensuring the success of our data governance initiatives and contributing to the overall mission of Better Care for a Better World.
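As a small illustration of the data quality monitoring this role owns, here is a sketch of a recurring completeness and timeliness check; the extract file, columns, and 30-day threshold are assumptions made for the example:

import pandas as pd

# Sketch of a recurring data-quality check over an HR data extract; names and thresholds are illustrative only.
extract = pd.read_csv("gpo_worker_extract.csv", parse_dates=["last_updated"])

completeness = 1 - extract[["worker_id", "country", "job_code"]].isna().mean()
staleness_days = (pd.Timestamp.now(tz="UTC") - extract["last_updated"].dt.tz_localize("UTC")).dt.days

report = {
    "completeness_by_field": completeness.round(3).to_dict(),
    "records_older_than_30_days": int((staleness_days > 30).sum()),
}
print(report)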
Posted 1 month ago
9.0 - 13.0 years
9 - 15 Lacs
Chennai
Work from Office
Petrofac is a leading international service provider to the energy industry, with a diverse client portfolio including many of the world's leading energy companies. We design, build, manage, and maintain infrastructure for our clients. We recruit, reward, and develop our people based on merit, regardless of race, nationality, religion, gender, age, sexual orientation, marital status, or disability. We value our people and treat everyone who works for or with Petrofac fairly and without discrimination.

The world is re-thinking its energy supply and energy security needs and planning for a phased transition to alternative energy sources. We are here to help our clients meet these evolving energy needs. This is an exciting time to join us on this journey. Are you ready to bring the right energy to Petrofac and help us deliver a better future for everyone?

JOB TITLE: Data Engineer

KEY RESPONSIBILITIES:
- Architect and define data flows for big data/data lake use cases.
- Apply excellent knowledge of the full life cycle of data management principles such as data governance, architecture, modelling, storage, security, master data, and quality.
- Act as a coach and provide consultancy services and advice to data engineers by offering technical guidance and ensuring architecture principles, design standards, and operational requirements are met.
- Participate in the Technical Design Authority forums.
- Collaborate with analytics and business stakeholders to improve data models that feed BI tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Work with a team of data engineers to deliver tasks and achieve weekly and monthly goals, and guide the team to follow best practices and improve deliverables.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Estimate cluster and core sizes, and monitor and troubleshoot the Databricks cluster and analysis server to provide optimal capacity for data ingestion.
- Deliver master data cleansing and improvement efforts, including automated and cost-effective solutions for processing, cleansing, and verifying the integrity of data used for analysis.
- Secure the big data environment, including encryption, tunnelling, access control, and secure isolation.
- Guide and build highly efficient OLAP cubes using data modelling techniques to cater to all required business cases and mitigate the limitations of Power BI in analysis services.
- Deploy and maintain highly efficient CI/CD DevOps pipelines across multiple environments such as dev, stg, and production.
- Strictly follow a scrum-based agile approach to development, working on allocated stories.
- Apply comprehensive knowledge of extracting, transforming, and loading data from various sources like Oracle, Hadoop HDFS, flat files, JSON, Avro, Parquet, and ORC.
- Experience defining, implementing, and maintaining a global data platform.
- Experience building robust and impactful data visualisation solutions and gaining adoption.
- Extensive work experience onboarding various data sources using real-time, batch, or scheduled loads; sources can be in the cloud or on premise, SQL DB, NoSQL DB, or API-based.
- Expertise in extracting data through JSON, OData, REST API, web services, and XML.
- Expertise in data ingestion platforms such as Apache Sqoop, Apache Flume, Amazon Kinesis, Fluent, Logstash, etc.
- Hands-on experience using Databricks, Pig, Scala, Hive, Azure Data Factory, Python, and R.
- Operational experience with big data technologies and engines including Presto, Spark, Hive, and Hadoop environments.
- Experience with various databases including Azure SQL DB, Oracle, MySQL, Cosmos DB, and MongoDB.
- Experience supporting and working with cross-functional teams in a dynamic environment.

ESSENTIAL QUALIFICATION & SKILLS:
- Bachelor's degree (master's preferred) in Computer Science, Engineering, or any other technology-related field.
- 10+ years of experience in data analytics platforms and hands-on experience with ETL and ELT transformations, with strong SQL programming knowledge.
- 5+ years of hands-on experience in big data engineering, distributed storage, and processing massive data into a data lake using Scala or Python.
- Proficient knowledge of the Hadoop and Spark ecosystems, including HDFS, Hive, Sqoop, Oozie, Spark core, and streaming.
- Experience with programming languages such as Scala, Java, Python, and shell scripting.
- Proven experience pulling data through REST API, OData, XML, and web services.
- Experience with Azure product offerings and the Azure data platform.
- Experience in data modelling (data marts, snowflake/star schemas, normalization, SCD2).
- Architect and define data flows and build highly efficient, scalable data pipelines.
- Work in tandem with the Enterprise and Domain Architects to understand the business goals and vision, and contribute to the Enterprise Roadmaps.
- Strong troubleshooting and problem-solving skills for any issues stopping business progress.
- Coordinate with multiple business stakeholders to understand requirements and deliver.
- Conduct a continuous audit of data management system performance, refine whenever required, and immediately report any breach or loopholes to stakeholders.
- Allocate tasks to team members, track status, and report on activities to management.
- Understand the physical and logical plan of execution and optimize the performance of data pipelines.
- Extensive background in data mining and statistical analysis.
- Ability to understand various data structures and common methods of data transformation.
- Ability to work with ETL tools, with strong knowledge of ETL concepts.
- Strong focus on delivering outcomes.
- Data management: modelling, normalisation, cleaning, and maintenance.
- Understand data architectures and data warehousing principles, and be able to participate in the design and development of conventional data warehouse solutions.
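As an illustration of the API-based source onboarding listed above, here is a minimal Python sketch of a paginated REST pull into landing storage; the endpoint, token, pagination fields, and output path are placeholders, not a real Petrofac system:

import json
import requests

# Illustrative batch pull from a REST source into landing storage.
BASE_URL = "https://example-source-system/api/v1/workorders"   # placeholder endpoint
headers = {"Authorization": "Bearer <token>"}                   # placeholder credential

page, rows = 1, []
while True:
    resp = requests.get(BASE_URL, headers=headers,
                        params={"page": page, "pageSize": 500}, timeout=60)
    resp.raise_for_status()
    batch = resp.json().get("items", [])   # assumed response shape
    if not batch:
        break
    rows.extend(batch)
    page += 1

with open("workorders_raw.json", "w") as fh:
    json.dump(rows, fh)
print(f"Landed {len(rows)} records")

In a Databricks or Data Factory setting the same pull would typically land the payload in cloud storage rather than a local file.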
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines on the Microsoft Azure cloud platform. Your primary focus will be on utilizing technologies such as Azure Data Factory, Azure Synapse Analytics, PySpark, and Python to handle complex data processing tasks efficiently.

Your key responsibilities will include designing and implementing data pipelines using Azure Data Factory or other orchestration tools, writing SQL queries for ETL processes, and collaborating with data analysts to meet data requirements and ensure data quality. You will also need to implement data governance practices for security and compliance, monitor and optimize data pipelines for performance, and develop unit tests for code.

Working in an Agile environment, you will be part of a team that develops Modern Data Warehouse solutions using the Azure stack, coding in Spark (Scala or Python) and T-SQL. Proficiency in source code control systems like Git, designing solutions with Azure data services, and managing team governance are essential aspects of this role. Additionally, you will provide technical leadership, guidance, and support to team members, resolve blockers, and report progress to customers regularly.

Preferred skills and experience for this role include a good understanding of PySpark and Python, proficiency in Azure Data Engineering tools (Azure Data Factory, Databricks, Synapse Analytics), experience in handling large datasets, exposure to DevOps basics, and knowledge of Release Engineering fundamentals.
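Since the role explicitly calls for unit tests around pipeline code, here is a minimal pytest sketch for a PySpark transformation; the transformation and column names are invented for illustration:

import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A toy transformation of the kind such tests would cover.
def add_status(df):
    return df.withColumn("status", F.when(F.col("amount") > 0, "open").otherwise("closed"))

@pytest.fixture(scope="module")
def spark():
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

def test_add_status_flags_positive_amounts(spark):
    df = spark.createDataFrame([(1, 10.0), (2, 0.0)], ["id", "amount"])
    result = {row["id"]: row["status"] for row in add_status(df).collect()}
    assert result == {1: "open", 2: "closed"}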
Posted 1 month ago
15.0 - 20.0 years
0 Lacs
Karnataka
On-site
We are looking for individuals who share our core belief that Every Day is Game Day. At Pine Labs, we aim to enrich the world through the power of digital commerce and financial services by bringing our best selves to work each day.

As a Senior Database Leader, you will be responsible for leading our Database Engineering and Analytics teams. Your role will involve a deep technical understanding of database architectures, managing production databases, and optimizing database environments on managed cloud platforms to enable highly available and scalable data solutions.

Your responsibilities will include leading and managing the Database Engineering and Analytics teams to ensure optimal performance and availability of all database environments. You will oversee the architecture and operation of production databases, manage various databases such as SQL Server, Postgres, and MongoDB, and deploy highly available database solutions with redundancy for critical business systems. Additionally, you will collaborate with cross-functional teams on data analytics initiatives, supporting machine learning projects and analytical workloads while leveraging cutting-edge technology to deliver best-in-class data solutions for our customers.

To excel in this role, you should have a minimum of 15-20 years of industry experience, including at least 5 years of direct experience in managing database engineering teams, data lakes, and analytical workloads. You should possess expertise in managing various databases, setting up highly available database architectures, and working with cloud-based database setups on platforms like AWS, Azure, or GCP. Additionally, experience in fintech, knowledge of reporting solutions, strong leadership skills, problem-solving abilities, and excellent communication skills are essential for success in this role.

At Pine Labs, we value individuals who make decisions fast, take ownership of their work, build solutions for merchants, seek continuous learning, and take pride in the work they do.
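As one small, concrete example of the high-availability operations this role oversees, here is a sketch of a replica-lag check against a PostgreSQL standby; the connection details are placeholders:

import psycopg2

# Sketch of a replica-lag check on a PostgreSQL standby; host and credentials are placeholders.
conn = psycopg2.connect(host="replica.example.internal", dbname="appdb",
                        user="monitor", password="secret")
with conn, conn.cursor() as cur:
    # pg_last_xact_replay_timestamp() returns NULL on a primary, hence the None check.
    cur.execute("SELECT EXTRACT(EPOCH FROM (now() - pg_last_xact_replay_timestamp()))")
    lag_seconds = cur.fetchone()[0]
    print(f"Replication lag: {lag_seconds:.1f}s" if lag_seconds is not None else "No replay activity yet")
conn.close()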
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Gurugram
Work from Office
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role? How we serve our customers is constantly evolving and is a challenge we gladly accept. Whether you're finding new ways to prevent identity fraud or enabling customers to start a new business, you can work with one of the most valuable data sets in the world to identify insights and actions that can have a meaningful impact on our customers and our business. And, with opportunities to learn from leaders who have defined the course of our industry, you can grow your career and define your own path. Find your place in risk and analytics on #TeamAmex.

Functional Description:
Enterprise Data Risk Management ("EDRM"), within the Global Risk & Compliance Organization, is the independent risk management function covering the risk of financial loss, reputational damage, or regulatory or legal action resulting from inadequate data governance and/or data management practices adversely impacting the accuracy, timeliness, comprehensiveness, or usability of data within or throughout its lifecycle. EDRM is hiring an Analyst who will play a key role in setting up the new transaction testing function within the Enterprise Data Risk Management team to ensure financial transactions are accurate, complete, and adhering to regulatory standards.

Role & Responsibilities:
- As testing will traverse products and systems across American Express, develop a risk-based approach to determine the prioritization and cadence of reviews for transaction testing of regulatory reports via comprehensive test plans, test cases, and test scripts based on the regulatory reports, products, and systems, considering applicable regulatory requirements and internal American Express policies.
- Support risk mitigation strategies by identifying, evaluating, and prioritizing data risks to develop tailored testing methodologies aligned to regulatory reporting processes and underlying transaction data complexity.
- Analyze large datasets to identify discrepancies, anomalies, and gaps in reported values by performing validations against source systems/points of origin.
- Implement transaction testing across regulatory reports to further validate accuracy and completeness of reported values against the points of origin.
- Design and prepare 2LoD transaction testing review reports summarizing the approach, testing methodology, and outcomes, inclusive of findings, if any.
- Document testing processes and outcomes, including issues, results, and overall accuracy.
- Contribute to detailed transaction testing across various card products and systems to validate data feeding into regulatory reports.
- Prepare and report updates on transaction testing and identified data risks to senior management.
- Perform data management controls testing across regulatory reports to validate overall control design, operational effectiveness, and coverage.
- Stay abreast of changes in banking regulations and reporting requirements (e.g., FFIEC, FRB, OCC, FDIC) to ensure transaction testing aligns with current mandates, regulations, industry standards, emerging trends, and overall best practices.

Minimum Qualifications:
- Degree in Finance, Accounting, Business Administration, Risk Management, or another related discipline is required.
- 2-3 years of experience in a regulatory reporting team, audit, compliance, or risk management within the banking or financial services industry.
- Demonstrated experience in transaction testing, data validation, and analysis is preferred, as well as additional experience or understanding of financial analytics, reporting, data analytics, data controls, and data transformation logic.
- Strong knowledge of data governance, data compliance, and data-related issue management in large financial services firms.
- Ability to utilize a data and business analytics background to develop winning strategies and drive business decision making.
- Knowledge, experience, or familiarity with regulatory reporting (FR2052a, FRY15, FRY9C, FRY14, etc.), audit, US GAAP, and financial accounting is preferred.
- Proficiency in data analysis tools (e.g., Excel, SQL) and knowledge of database systems.
- Strong analytical, problem-solving, and critical thinking skills.
- Adept verbal and written communication skills, including the ability to explain complex problems and ideas clearly and succinctly to senior management.
- Ability to effectively manage multiple, and often conflicting, priorities under tight timeframes and adapt to frequent change.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
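To illustrate the kind of transaction testing described in this posting, here is a minimal sketch that reconciles reported values against a source-system extract; the file names, join key, shared amount column, and tolerance are assumptions:

import pandas as pd

# Illustrative reconciliation of reported values against the point of origin.
reported = pd.read_csv("regulatory_report_lines.csv")     # values as filed (assumed to contain 'amount')
source = pd.read_csv("source_system_extract.csv")         # point-of-origin values (assumed to contain 'amount')

merged = reported.merge(source, on="transaction_id", how="outer",
                        suffixes=("_reported", "_source"), indicator=True)

unmatched = merged[merged["_merge"] != "both"]
mismatched = merged[(merged["_merge"] == "both") &
                    ((merged["amount_reported"] - merged["amount_source"]).abs() > 0.01)]

print(f"Unmatched records: {len(unmatched)}, amount mismatches: {len(mismatched)}")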
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
Nashik, Maharashtra
On-site
As a Business Development Associate at our company, you will have the opportunity to engage with OEM affiliates such as sales executives and car evaluators to onboard them as partners. Your responsibilities will also include identifying and onboarding new affiliate channels, exploring offline auctions, and managing procurement coordination. You will be in charge of scheduling and overseeing vehicle inspections from various lead sources, negotiating pricing, and finalizing procurements post-inspection. Taking ownership of inside leads and driving conversions will be a key part of your role, along with ensuring end-to-end data compliance for all leads.

To excel in this position, you should have 1-3 years of experience in business development, vendor onboarding, or business acquisition, with a strong preference for supply-side experience. A Bachelor's degree from a Tier-2 or above college is required. Your success will be supported by your strong communication and negotiation skills, as well as your ability to work independently and be proactive. Prior startup experience is considered a plus.

Joining our team means being part of a fast-paced, high-growth company where you will work with industry experts and build strong networks. We offer a competitive salary along with performance-based incentives in a dynamic and entrepreneurial work environment.

Key Skills: vendor onboarding, vendor management, procurement coordination, data compliance, travel, communication skills, vehicle assessment, sellers, procurement, closure, sales, market research, relationship building, fieldwork, negotiation skills, negotiation, vendors, sales skills, business development, management, business acquisition, communication.
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have a total of 8+ years of development/design experience, with a minimum of 3 years of experience in Big Data technologies on-prem and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required. In addition, you should have strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake, BigQuery, or Redshift. Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs is essential.

Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as well as experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. Your day-to-day responsibilities will cover the same areas outlined above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development. You will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
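To illustrate the Snowflake-centred pipeline checks this role mentions, here is a minimal sketch using the Snowflake Python connector; the account, credentials, and table names are placeholders:

import snowflake.connector

# Sketch of a scheduled quality check against a staging table; all identifiers are placeholders.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="secret",
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT COUNT(*) AS total_rows,
               COUNT_IF(customer_id IS NULL) AS missing_customer_id
        FROM stg_orders
        WHERE load_date = CURRENT_DATE
    """)
    total, missing = cur.fetchone()
    print(f"Loaded {total} rows, {missing} missing customer_id")
finally:
    conn.close()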
Posted 1 month ago
10.0 - 12.0 years
15 - 18 Lacs
Kolkata
Work from Office
The Group Chief Data Officer (CDO) is responsible for defining and implementing the enterprise-wide data and analytics strategy across the group companies, ensuring data quality and compliance, and aligning the data strategy with the business objectives of the group entities.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Business Operations Specialist at our organization, your primary responsibility will be to support business intelligence and data governance initiatives. You will collaborate with cross-functional teams to analyze data, build dashboards, and assist in implementing data governance frameworks to ensure data quality, consistency, and compliance. This role is based in Pune, India, and is a 6-month contract position with 40 hours per week, including 2-3 days working onsite at the client's office.

Your key responsibilities will include developing a deep understanding of business processes and data needs, designing and developing data models, reports, and dashboards using BI tools, and performing detailed data analysis to identify trends and actionable insights. You will also be involved in creating and maintaining data documentation, assisting in the implementation of a robust data governance framework, and collaborating with teams to align data strategies with organizational goals.

To excel in this role, you should possess strong analytical skills, attention to detail, and proficiency in BI tools such as Power BI or Tableau, SQL, Snowflake, and data governance platforms. A problem-solving mindset, excellent communication skills, and the ability to work with cross-functional teams are essential. Additionally, a willingness to learn and adapt to emerging trends in data analytics and tools is crucial. A Bachelor's degree is required for this position.

If you meet the requirements and are interested in this opportunity, please submit your resume through our network at https://www.stage4solutions.com/careers/. Feel free to share this opening with others who might be a good fit for the Business Operations Specialist (Sales Ops) role in Pune, India.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years of experience in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. Your role will require strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake, BigQuery, or Redshift and BI tools like Tableau, QuickSight, or Power BI (at least one is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool dbt.

Your responsibilities will include implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a perfect balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
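Since the role pairs Airflow with dbt for orchestration, here is a minimal Airflow DAG sketch of such an ELT flow; the schedule, dbt project path, and load script are assumptions:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal Airflow 2.x sketch (uses the Airflow 2.4+ "schedule" argument).
# The schedule, paths, and script names are illustrative placeholders.
with DAG(dag_id="daily_elt", start_date=datetime(2024, 1, 1),
         schedule="0 2 * * *", catchup=False) as dag:
    load_raw = BashOperator(task_id="load_raw",
                            bash_command="python /opt/jobs/load_raw_to_warehouse.py")
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="dbt run --project-dir /opt/dbt/analytics")
    dbt_test = BashOperator(task_id="dbt_test",
                            bash_command="dbt test --project-dir /opt/dbt/analytics")
    load_raw >> dbt_run >> dbt_test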
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
You should possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 8 years of work experience in data-first systems. Additionally, you should have a minimum of 4 years of experience working on Data Lake/Data Platform projects specifically on AWS or Azure. It is crucial to have extensive knowledge and hands-on experience with data warehousing tools such as Snowflake, BigQuery, or Redshift. Proficiency in SQL for managing and querying data is a must-have skill for this role.

You are expected to have experience with relational databases like Azure SQL and AWS RDS, as well as an understanding of NoSQL databases like MongoDB for handling various data formats and structures. Familiarity with orchestration tools like Airflow and dbt would be advantageous. Experience in building stream-processing systems using solutions such as Kafka or Azure Event Hubs is desirable.

Your responsibilities will include designing and implementing ETL/ELT processes using tools like Azure Data Factory to ingest and transform data into the data lake. You should also have expertise in data migration and processing with AWS (S3, Glue, Lambda, Athena, RDS Aurora) or Azure (ADF, ADLS, Azure Synapse, Databricks). Data cleansing and enrichment skills are crucial to ensure data quality for downstream processing and analytics. Furthermore, you must be capable of managing schema evolution and metadata for the data lake, with experience in tools like Azure Purview for data discovery and cataloging. Proficiency in creating and managing APIs for data access, preferably with experience in JDBC/ODBC, is required. Knowledge of data governance practices, data privacy laws like GDPR, and implementing security measures in the data lake are essential aspects of this role.

Strong programming skills in languages like Python, Scala, or SQL are necessary for data engineering tasks. Additionally, experience with automation and orchestration tools, familiarity with CI/CD practices, and the ability to optimize data storage and retrieval for analytical queries are key requirements.

Collaboration with the Principal Data Architect and other team members to align data solutions with architectural and business goals is crucial. As a lead, you will be responsible for critical system design changes and software projects, and for ensuring timely project deliverables. Collaborating with stakeholders to translate business needs into efficient data infrastructure systems is a key aspect of this role. Your ability to review design proposals, conduct code review sessions, and promote best practices is essential. Experience working in an Agile model, delivering quality deliverables on time, and translating complex requirements into technical solutions is also part of your responsibilities.
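To make the AWS side of the stack concrete, here is a minimal sketch of triggering a Glue ETL job once raw files have landed in S3; the bucket, prefix, and job name are placeholders, and the Azure equivalent would typically use Data Factory pipelines:

import boto3

# Sketch: start a Glue ETL job after new files land in S3; all names are placeholders.
s3 = boto3.client("s3")
glue = boto3.client("glue")

objects = s3.list_objects_v2(Bucket="corp-data-lake-raw", Prefix="sales/2024/07/")
if objects.get("KeyCount", 0) > 0:
    run = glue.start_job_run(JobName="sales_raw_to_curated",
                             Arguments={"--ingest_prefix": "sales/2024/07/"})
    print("Started Glue job run:", run["JobRunId"])
else:
    print("No new files to process")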
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
As a Power BI Developer, you will be responsible for developing and maintaining scalable data pipelines using Python and PySpark. Your role will involve collaborating with data engineers and data scientists to understand and fulfill data processing needs. You will be expected to optimize and troubleshoot existing PySpark applications for performance improvements and write clean, efficient, and well-documented code following best practices. Participation in design and code reviews is essential to ensure high-quality deliverables.

Moreover, you will play a key role in developing and implementing ETL processes to extract, transform, and load data. It will be your responsibility to ensure data integrity and quality throughout the data lifecycle. Staying current with the latest industry trends and technologies in big data and cloud computing is crucial to excel in this role.

The ideal candidate should have a minimum of 6 years of experience in designing and developing advanced Power BI reports and dashboards. Proficiency in data modeling and DAX calculations is required, along with experience in developing and maintaining data models, creating reports and dashboards, analyzing and visualizing data, and ensuring data governance and compliance. Troubleshooting and optimizing Power BI solutions will also be part of your responsibilities.

Preferred skills for this role include strong proficiency in Power BI Desktop, DAX, Power Query, and data modeling. Experience in analyzing data, creating visualizations, and building interactive dashboards is highly valued. Additionally, you should possess excellent communication and collaboration skills to effectively work with stakeholders. Familiarity with SQL and data warehousing concepts, as well as experience with UI/UX development, will be beneficial in successfully fulfilling the responsibilities of this position.
Posted 2 months ago
5.0 - 8.0 years
8 - 12 Lacs
Mohali, Gurugram
Hybrid
Pearce Services is seeking a skilled and experienced UKG Developer to support our internal systems and HR platforms. This position requires advanced proficiency in UKG development, including timekeeping, payroll, and HRIS configurations. As the developer will report directly to a US-based manager, excellent English communication skills are critical. The ideal candidate is self-driven, comfortable managing tasks independently, and capable of turning complex requirements into scalable solutions.

Responsibilities:
- Collaborate with cross-functional teams to gather business requirements and translate them into UKG system configurations.
- Develop and maintain UKG workflows, interfaces, and integration scripts with other enterprise systems.
- Build custom reports, dashboards, and automation solutions aligned with business needs.
- Provide technical support and troubleshoot issues in UKG applications, partnering with stakeholders across HR, Finance, and IT.
- Maintain system documentation, user guides, and training materials.
- Ensure data integrity, compliance, and security within the UKG system.
- Stay updated on UKG platform enhancements, industry best practices, and regulatory changes.

Qualifications:
- Minimum 5 years of experience as a UKG/UltiPro developer or in a similar HRIS development role.
- Proven ability to work independently and manage multiple priorities with minimal supervision.
- Excellent verbal and written communication skills in English, with the ability to engage effectively across time zones.
- Solid understanding of UKG modules including Core HR, Payroll, Time & Attendance, and reporting.
- Proven understanding of fundamental HR and payroll terms and principles.
- Understanding of security principles and safe handling of private, protected information.
- Experience in creating APIs, web services, and integrations between UKG and other systems.
- Strong problem-solving mindset and attention to detail.
- Familiarity with Agile or Scrum methodologies (preferred).
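For orientation only, here is a heavily simplified sketch of the kind of integration flow the role describes; the endpoints, auth scheme, and payload below are hypothetical placeholders rather than the actual UKG API, which a real integration would follow from UKG's published specifications:

import requests

# Hypothetical HRIS-to-ERP sync sketch; URLs, auth, and response shape are placeholders, not UKG's real API.
BASE_URL = "https://example-tenant.example-hris-host.com/api"
session = requests.Session()
session.headers.update({"Authorization": "Bearer <access-token>",
                        "Content-Type": "application/json"})

resp = session.get(f"{BASE_URL}/employees", params={"updatedSince": "2024-07-01"}, timeout=60)
resp.raise_for_status()

for employee in resp.json():
    # Forward changed records to a downstream payroll/ERP system (placeholder endpoint).
    session.post("https://erp.example.internal/api/workers", json=employee, timeout=60)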
Posted 2 months ago
4.0 - 5.0 years
3 - 5 Lacs
Kolkata
Remote
Job Title: Data Protection Officer (Contract-Based | Hourly | On-Call)
Location: Remote / India (with availability for EU/UK time zone coordination)
Type: Contractual | Hourly Basis | As-needed Engagement
Experience: 4-5 years of relevant experience

Job Summary:
We are looking for an experienced and independent Data Protection Officer (DPO) to support our organization in ensuring compliance with the General Data Protection Regulation (GDPR) and other applicable data privacy laws. This is a contract-based, hourly paid position, and the DPO will be engaged on an as-needed basis. The role requires flexibility to provide consultation, conduct reviews, and respond to data protection matters when required.

Key Responsibilities:
- Serve as the point of contact for UK/EU residents, supervisory authorities, and internal teams regarding data protection issues.
- Identify and evaluate the company's data processing activities.
- Provide expert advice on conducting Data Protection Impact Assessments (DPIA).
- Monitor compliance with GDPR and applicable local data protection laws.
- Review and advise on data management procedures and internal policies.
- Offer consultation on incident response and handling of privacy breaches.
- Track regulatory changes and provide recommendations to maintain compliance.
- Maintain and update a register of processing operations, including risk-prone processes for prior checks.
- Support internal awareness and training initiatives regarding data protection obligations.

Requirements:
- Work experience in data protection and legal compliance is a must.
- Solid knowledge of GDPR and data protection laws.
- Ability to handle confidential information.
- Ensure that controllers and data subjects are informed about their data protection rights, obligations, and responsibilities, and raise awareness about them.
- Create a register of processing operations within the institution and notify the EDPS of those that present specific risks (so-called prior checks).
- Ethical, with the ability to remain impartial and report all non-compliances.
- Organizational skills with attention to detail.

Experience: 4-5 years of expertise in managing international data protection compliance programs and implementing data governance policies, technology compliance standards and programs, and privacy-by-design frameworks. To be successful in this role, you should have in-depth knowledge of GDPR and local data protection laws and be familiar with our industry and the nature of its data processing activities.
Posted 2 months ago
6.0 - 11.0 years
10 Lacs
Hyderabad
Work from Office
- Develop and maintain SQL queries to extract, validate, and transform encounter data for submission.
- Support the generation and submission of 837 institutional and professional encounter files.
- Review, analyze, and correct encounter rejections using 999, 277CA, and TA1 reports.
- Collaborate with cross-functional teams (claims, configuration, IT, compliance) to resolve data discrepancies.
- Monitor encounter submission status and prepare regular reports and dashboards.
- Ensure data compliance with CMS, Medicaid guidelines, and state-specific submission requirements.
- Maintain detailed documentation of ETL processes and business rules used in encounter workflows.

Contact Person: Divya R
Contact Number: 9940653213
Email: rdivya@gojobs.biz
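For illustration, here is a minimal sketch of the kind of rejection-review query this role maintains, run from Python against a SQL Server-style warehouse; the DSN, tables, and columns are assumptions, not a real encounter system:

import pyodbc

# Illustrative review of recent encounter rejections; DSN, table, and column names are assumptions.
conn = pyodbc.connect("DSN=encounters_dw;UID=report_user;PWD=secret")
cursor = conn.cursor()

cursor.execute("""
    SELECT r.claim_id, r.rejection_code, r.rejection_desc, e.submission_batch
    FROM encounter_rejections AS r
    JOIN encounter_submissions AS e ON e.claim_id = r.claim_id
    WHERE r.report_type IN ('999', '277CA', 'TA1')
      AND r.received_date >= DATEADD(day, -7, CAST(GETDATE() AS date))
    ORDER BY r.rejection_code
""")
for row in cursor.fetchall():
    print(row.claim_id, row.rejection_code, row.rejection_desc)
conn.close()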
Posted 2 months ago
3.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
Position: Data Lifecycle Management (DLM) Specialist | Mumbai | WFO
Location: Goregaon, Mumbai (apply if you are from the Western line)
Shift Timing: 9 AM to 6 PM
Notice Period: Immediate to 30 Days
Experience: 3 to 5 Years
Work Mode: Work from Office (WFO)
Interested candidates can apply to saikeertana.r@twsol.com

Role Overview:
Seeking a highly motivated and client-centric DLM Specialist with 3-5 years of experience in data management, financial services, or other regulated industries. This role focuses on reviewing applications and ensuring data retention, disposition, and archiving compliance while aligning with privacy regulations and internal policy.

Key Responsibilities:
- Assess data retention, archiving, and disposition requirements across all business divisions
- Conduct regular reviews and stakeholder meetings with business and technology teams
- Manage data risk identification and mitigation plans related to retention, location, and transfer
- Document concise data management requirements and ensure implementation tracking
- Support in defining operational and compliance controls
- Compile analysis reports and drive recommendation implementation
- Engage system owners in problem-solving and decision-making
- Represent DLM in cross-functional meetings to communicate policy standards
- Prepare progress reports and contribute to process improvements

Required Qualifications:
- Bachelor's degree
- 3 to 5 years of experience in information/data management, data storage, or financial services operations
- Strong business analysis skills
- Excellent verbal and written communication skills in English
- High attention to detail with the ability to document complex information clearly
- Demonstrated client servicing ability and stakeholder management
- Experience in developing business and functional requirements for tech systems

Nice to Have:
- Degree in Information Systems, Business Administration, Archiving, or Law
- Understanding of personal data protection and privacy regulations
- Familiarity with database and cloud technologies, and AI trends
- Reporting experience with Power BI / Tableau
- Experience working with high-volume datasets
Posted 2 months ago
5.0 - 8.0 years
5 - 10 Lacs
Chennai
Hybrid
As a Research Specialist III, you will be responsible for researching, verifying, and updating data for ZoomInfo's industry-leading sales intelligence platform. The right candidate for this role has an engaging personality, an eye for quality, and a drive to learn with us as we continue to improve the top-quality research processes that keep ZoomInfo ahead of our competition.

What You'll Do:
- Data Research: Conduct thorough research to collect and validate company firmographic data, including details such as company size, industry classification, and location.
- Executive Contact Data: Gather and verify executive contact information, including names, titles, emails, and phone numbers, ensuring data accuracy.
- Data Integrity: Maintain a high level of attention to detail to uphold data quality and consistency standards.
- Adhere to Standards: Adhere to research protocols and privacy laws, and maintain confidentiality to protect operations and ensure customer confidence.
- Collaboration: Collaborate effectively with cross-functional teams to contribute to the improvement and growth of our sales intelligence database.

What You Bring:
- Minimum 5 to 8 years of previous experience in a Data Research role.
- Shift time and overlap: 1 PM IST to 10 PM IST; at times there might be an overlap with PST time zones as required to align with project needs.
- Excellent understanding of company size, structure, and location, classification of companies (industry, ownership type, and business), and a good understanding of corporate actions like mergers, acquisitions, and parent-subsidiary relationships.
- Ability to establish priorities and work independently with little supervision.
- Experience working with spreadsheets, and the ability to analyze data tables and draw conclusions.
- Attention to detail and numeracy abilities.
- Ability to maintain a high level of accuracy while balancing changes in workload.

This is a mandatory hybrid role (3 days work from office and 2 days work from home) with a general shift.

Designation: Research Specialist III
Location: Global Info City, Block A, 11th Floor, MGR Salai, Perungudi, Chennai
Reporting to: Team Leader - Data Research
Posted 2 months ago
5.0 - 10.0 years
12 - 18 Lacs
Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)
Work from Office
Role & Responsibilities:
- Data Management: Maintain ERP master data and extend support as close to real-time as possible.
- Data Governance: Establish and enforce data governance policies and procedures to ensure data accuracy, consistency, and reliability. This involves defining data standards, data quality rules, and access controls.
- Data Collection and Integration: Gather data from various sources, including internal systems (including business finance teams) and external data providers. Integrate data to create a unified view of credit exposure and commodity risk across the organization.
- Data Cleansing and Validation: Identify and rectify data inconsistencies, errors, and redundancies. Ensure that the data is validated against predefined quality criteria.
- Regulatory Compliance: Ensure that data management processes adhere to relevant regulatory requirements and industry standards.
- Data Security: Implement data security measures to protect sensitive credit and commodity data from unauthorized access or breaches.
- Data Maintenance: Monitor data quality and perform regular data maintenance tasks to keep the Master Desk system up to date.
- System Integration: Collaborate with IT teams to integrate the Master Desk data with other enterprise systems.
- Continuous Improvement: Identify areas for process improvement and optimization within the Master Desk framework to enhance efficiency and accuracy.

Education preference: Preferably Chartered Accountant/ICWA or MBA (Finance)
Posted 2 months ago
3.0 - 8.0 years
2 - 6 Lacs
Hyderabad
Remote
Role & Responsibilities:
- Minimum 2 years of experience in HR Services and any HR application such as SuccessFactors, SAP, or Workday.
- Proficiency in SAP/SuccessFactors and MS tools like SharePoint and Excel; knowledge of CRM tools like Dynamics, ServiceNow, etc.
- Graduation required; Post-Graduation (any specialization) will be an advantage.
- Excellent written and verbal English communication (important).
- Exposure to Customer Relationship Management tools will be an added advantage (ServiceNow, CRM, Siebel, etc.).
- Knowledge of MS tools (SharePoint, Excel & PowerPoint).
- Attention to detail and ability to follow guidelines.
- Ability to maintain highly confidential and sensitive information.
- Ability to deliver against agreed objectives/service levels.
- Ability to work effectively in a team and willingness to help others.

Contract Description:
We are looking for contract staff for HR Services to work on EMEA-related HR Operations tasks and queries. The HR Services Delivery Center team plays a pivotal role in improving the candidate, employee, and manager experience by providing timely and accurate query resolution, onboarding of candidates, maintaining accurate HR data of employees in HR systems, and supporting employee life cycle programs and processes (benefits, rewards, transfers, offboarding, etc.).

Key Accountabilities:
- Maintain efficient service delivery by ensuring transactional requests and assigned inquiries are completed within SLA depending on priority and complexity.
- Respond to and resolve queries in a timely and accurate manner with employee experience at the core.
- Accountable for ensuring employee HR records are accurately created and/or maintained in HR systems (SAP, SuccessFactors, MS Vacation, etc.).
- Take complete ownership to close data administration requests, including following up with the requestor to collect missing information and/or communicating approval requirements.
- Maintain and follow the Desktop Procedures / KB articles defined for every transaction/query.
- Ensure the Maker-Checker process is followed and data monitoring is done to ensure high quality of data in all HR tools.
- Working in a highly data-sensitive environment, responsible for always protecting data privacy and adhering to confidentiality requirements to promote zero breach of compliance policies.
Posted 2 months ago
5.0 - 9.0 years
7 - 12 Lacs
Bengaluru
Work from Office
As a Bulk WhatsApp Solution Specialist, you will play a crucial role in developing, implementing, and managing WhatsApp-based bulk messaging campaigns for various business objectives. You will work closely with clients to create impactful messaging strategies while ensuring compliance with WhatsApp's terms of service and best practices. Your ultimate goal will be to maximize campaign success while maintaining the integrity and quality of the messaging experience.

Key Responsibilities:

Strategy & Planning:
- Develop and implement WhatsApp-based bulk messaging strategies that align with broader business goals and customer engagement initiatives.
- Continuously monitor industry trends and WhatsApp platform updates to refine messaging strategies.
- Collaborate with internal teams to tailor messaging approaches for specific campaigns or customer segments.

Compliance & Best Practices:
- Ensure all bulk messaging campaigns comply with WhatsApp's terms of service, privacy policies, and applicable data regulations.
- Implement best practices for WhatsApp bulk messaging to improve engagement rates, message deliverability, and overall campaign effectiveness.
- Proactively address potential risks related to WhatsApp policies, ensuring campaigns are executed within legal and platform guidelines.

Campaign Management & Optimization:
- Manage end-to-end execution of bulk messaging campaigns, including setup, monitoring, and performance analysis.
- Utilize data-driven insights to optimize message content, timing, and frequency for maximum effectiveness.
- Track key performance metrics and report on campaign results to stakeholders, providing actionable insights for future strategies.

Skills & Qualifications:

Technical Expertise:
- In-depth knowledge of WhatsApp Business and its capabilities, including automation, messaging templates, and customer interaction flows.
- Experience working with bulk messaging tools and platforms, with a focus on WhatsApp.
- Familiarity with marketing automation tools such as Zapier, Google Sheets, or similar platforms.

Communication & Analytical Skills:
- Strong verbal and written communication skills with the ability to craft compelling messages that resonate with customers.
- Analytical mindset with the ability to interpret campaign data and use insights to improve future initiatives.

Collaboration & Independence:
- Ability to work autonomously and manage multiple projects simultaneously.
- Strong interpersonal skills, with the ability to collaborate with cross-functional teams and clients.

Data Privacy & Compliance Awareness:
- Solid understanding of data privacy regulations and how they apply to bulk messaging campaigns.
- Ability to ensure campaigns meet legal requirements and maintain customer trust.
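To ground the technical-expertise requirement, here is a minimal sketch of sending an approved template message through the WhatsApp Business Cloud API; the API version, phone-number ID, token, template name, and recipient numbers are placeholders:

import requests

# Sketch: send an approved template message via the WhatsApp Business Cloud API.
# The version, IDs, token, and template below are placeholders for illustration.
PHONE_NUMBER_ID = "123456789012345"
URL = f"https://graph.facebook.com/v19.0/{PHONE_NUMBER_ID}/messages"
HEADERS = {"Authorization": "Bearer <access-token>"}

recipients = ["919800000001", "919800000002"]  # opted-in numbers only
for number in recipients:
    payload = {
        "messaging_product": "whatsapp",
        "to": number,
        "type": "template",
        "template": {"name": "order_update", "language": {"code": "en"}},
    }
    resp = requests.post(URL, headers=HEADERS, json=payload, timeout=30)
    print(number, resp.status_code)

In practice, sends would be rate-limited, logged, and restricted to opted-in recipients in line with the compliance responsibilities described above.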
Posted 3 months ago