8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As an experienced Information Security professional with 8+ years of experience, you will be responsible for planning, implementing, managing, and maintaining security systems such as antimalware solutions, vulnerability management solutions, and SIEM solutions. Your role will involve monitoring and investigating security alerts from various sources, providing incident response, and identifying potential weaknesses within the organization's network and systems to recommend effective solutions. Additionally, you will take up security initiatives to enhance the overall security posture of the organization.

You will be required to document Standard Operating Procedures (SOPs), metrics, and reports as necessary, provide Root Cause Analyses (RCAs) for security incidents, and collaborate with different teams and departments to address vulnerabilities and security incidents and drive security initiatives. Moreover, researching and monitoring emerging threats and vulnerabilities, understanding current industry and technology trends, and assessing their impact on applications will be crucial aspects of your role.

Your qualifications should include industry-recognized professional certifications such as CISSP, GCSA, CND, or similar. Demonstrated experience in computer security with a focus on risk analysis, audit, and compliance objectives is essential. Proficiency in network and web security tools like Palo Alto, ForeScout, and Zscaler, as well as experience with AWS cloud environments and Privileged Access Management solutions, will be advantageous. Familiarity with SIEM/SOAR, NDR, EDR, VM, and data security solutions and concepts is desired. The ideal candidate will possess strong decision-making and complex problem-solving skills under pressure, along with a high degree of creativity and "out-of-the-box" thinking.
The ability to manage multiple projects simultaneously in fast-paced environments, a service-oriented approach, and excellent communication, presentation, and writing skills are key requirements for this role. You should also be adept at sharing knowledge, collaborating with team members and customers, and adapting to a fast-paced, ever-changing global environment. Strong organization, time management, and priority-setting skills are essential, along with a proactive approach to achieving results. In summary, this role offers an exciting opportunity for an experienced Information Security professional to contribute to the enhancement of the organization's security posture, collaborate with diverse teams, and stay abreast of emerging threats and industry trends.
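The alert-monitoring side of a role like this often comes down to correlation rules over event streams. As a hedged illustration (the alert format, field names, and thresholds here are invented for the example, not taken from the posting), a toy SIEM-style rule in Python might look like:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def correlate_failed_logins(alerts, threshold=3, window=timedelta(minutes=10)):
    """Flag hosts that raise `threshold` or more failed-login alerts
    inside a sliding time window -- a toy SIEM-style correlation rule."""
    by_host = defaultdict(list)
    for alert in alerts:
        if alert["type"] == "failed_login":
            by_host[alert["host"]].append(alert["time"])
    flagged = set()
    for host, times in by_host.items():
        times.sort()
        # Slide a window of `threshold` consecutive alerts over the timeline.
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.add(host)
                break
    return flagged

base = datetime(2024, 1, 1, 9, 0)
alerts = [
    {"type": "failed_login", "host": "10.0.0.5", "time": base},
    {"type": "failed_login", "host": "10.0.0.5", "time": base + timedelta(minutes=2)},
    {"type": "failed_login", "host": "10.0.0.5", "time": base + timedelta(minutes=4)},
    {"type": "failed_login", "host": "10.0.0.9", "time": base},
]
print(correlate_failed_logins(alerts))  # {'10.0.0.5'}
```

A production SIEM expresses rules like this declaratively, but the windowing logic is the same idea.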
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Golden Eagle IT Technologies Pvt. Ltd. is looking for a skilled Data Engineer with 2 to 4 years of experience to join the team in Indore. The ideal candidate should have a solid background in data engineering, big data technologies, and cloud platforms. As a Data Engineer, you will be responsible for designing, building, and maintaining efficient, scalable, and reliable data pipelines.

You will be expected to develop and maintain ETL pipelines using tools like Apache Airflow, Spark, and Hadoop. Additionally, you will design and implement data solutions on AWS, leveraging services such as DynamoDB, Athena, Glue Data Catalog, and SageMaker. Working with messaging systems like Kafka for managing data streaming and real-time data processing will also be part of your responsibilities. Proficiency in Python and Scala for data processing, transformation, and automation is essential. Ensuring data quality and integrity across multiple sources and formats will be a key aspect of your role. Collaboration with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions is crucial. Optimizing and tuning data systems for performance and scalability, as well as implementing best practices for data security and compliance, are also expected.

Preferred skills include experience with infrastructure-as-code tools like Pulumi, familiarity with GraphQL for API development, and exposure to machine learning and data science workflows, particularly using SageMaker. Qualifications for this position include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 2-4 years of experience in data engineering or a similar role. Proficiency in AWS cloud services and big data technologies, strong programming skills in Python and Scala, knowledge of data warehousing concepts and tools, and excellent problem-solving and communication skills are required.
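To make the ETL responsibility concrete, here is a minimal, self-contained sketch of an extract/transform/load step in plain Python; in an Airflow deployment each function would typically be wrapped in its own task. The record fields and names are hypothetical, not from the posting:

```python
def extract(source_rows):
    """Extract step: yield raw records from a (mock) source system."""
    yield from source_rows

def transform(records):
    """Normalise field names, coerce types, and drop invalid rows."""
    for rec in records:
        try:
            yield {"user_id": int(rec["id"]), "country": rec["country"].strip().upper()}
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a dead-letter store

def load(records, sink):
    """Load step: append clean records to the target store."""
    sink.extend(records)

warehouse = []
raw = [{"id": "42", "country": " in "}, {"id": "oops", "country": "us"}]
load(transform(extract(raw)), warehouse)
print(warehouse)  # [{'user_id': 42, 'country': 'IN'}]
```

Keeping each stage a pure generator makes the pipeline easy to unit-test and to parallelise later in Spark or Airflow.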
Posted 3 days ago
4.0 - 9.0 years
11 - 16 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Implement automation tools and frameworks Collaborate within the programme and across the organisation to improve engineering tools, systems, procedures and data security Seek ways to provide automation that helps assure security, performance and availability of cloud infrastructure and services Develop and maintain design and troubleshooting documentation Requirements Bachelor's degree in an IT or engineering discipline A minimum of 3 years' experience of software development with both waterfall and Agile methodologies Strong technical capabilities, knowledge and experience in microservice development/governance, DevOps, and Disciplined Agile Delivery (DAD) Expert with Java and web-based technologies (Spring Boot, Spring Cloud) and both SQL and NoSQL databases Experience with production support and incident management Proactive, flexible and a team player Solid experience in big data, AI and public cloud related technologies.
The successful candidate will also meet the following requirements: A wide knowledge of IT hardware, software, operations and networks A good understanding of Domain-Driven Design, Behaviour-Driven Development and Test-Driven Development.
Posted 3 days ago
6.0 - 11.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Snapshot Artificial Intelligence could be one of humanity's most useful inventions. At Google DeepMind, we're a team of scientists, engineers, machine learning experts and more, working together to advance the state of the art in artificial intelligence. We use our technologies for widespread public benefit and scientific discovery, and collaborate with others on critical challenges, ensuring safety and ethics are the highest priority. Gemini, the game-changing personal assistant powered by generative AI, revolutionizes your mobile experience. The Gemini mobile apps deliver tailored support directly to your device. Our team's mission is to lead the charge in evolving Gemini on Android and iOS, placing Google at the forefront of the LLM and generative AI revolution. About us The Gemini on Mobile (Android+iOS) team is responsible for the entire Gemini (and previously Assistant) experience. Our team members are self-sufficient problem solvers. We are looking for people who are passionate about app development, and who are always one step ahead in development platforms, new functionality and APIs. The role The engineer will be responsible for the full software development lifecycle, from designing scalable business logic and user interfaces to ensuring application quality, performance, security, and reliability. Key responsibilities Design and implement scalable business logic for millions of Gemini users for both 1P and 3P ecosystems. Design and construct user interfaces (UIs) on the Android platform. Collaborate with UI/UX designers to develop intuitive and responsive interfaces that provide a seamless experience for Gemini. Monitor and troubleshoot issues to maintain a consistent and delightful experience and address internal and external user feedback effectively. Design and implement logging and metrics for production monitoring and the identification of key insights. Oversee application testing, qualification, automation and releases on a periodic cadence.
Ensure quality in production and maintain a high app rating. Implement performance (battery, memory, latency) optimizations wherever needed to ensure a fast, smooth and seamless user experience. Ensure data security and privacy through the implementation of appropriate data handling and storage practices. About you We are seeking a developer skilled in Android mobile app development. The ideal candidate has experience with Android mobile UI and application development, and is capable of leading complex projects. In order to set you up for success as a Software Engineer at Gemini, we look for the following skills and experience: Bachelor's degree or equivalent practical experience. 6+ years of experience with software development, and 4 years of experience with data structures or algorithms in either an academic or industry setting and with Android application development. 2+ years of experience leading workstreams with at least 3-5 engineers. In addition, the following would be an advantage: Familiarity with Android application development frameworks. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to learn new technologies, adapt to evolving requirements, and drive ambiguous problems end to end. Note: In the event your application is successful and an offer of employment is made to you, any offer of employment will be conditional on the results of a background check, performed by a third party acting on our behalf. For more information on how we handle your data, please see our Applicant and Candidate Privacy Policy. At Google DeepMind, we value diversity of experience, knowledge, backgrounds and perspectives and harness these qualities to create extraordinary impact.
We are committed to equal employment opportunities regardless of sex, race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, pregnancy, or related condition (including breastfeeding) or any other basis as protected by applicable law. If you have a disability or additional need that requires accommodation, please do not hesitate to let us know.
Posted 3 days ago
3.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Your work days are brighter here. At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday. About the Team The Tax team at Workday is committed to bringing passion and customer focus to everything we do, a place where everyone is treated equitably and communications are held openly and honestly. Our employees are at the very core of our values, and we pride ourselves on providing a rewarding career path in a fun environment that makes work that much more enjoyable. This is a newly created role in our Workday India Pune office and a fantastic opportunity to join a growing tax team. About the Role We are seeking an Associate Indirect Tax Analyst to join our dynamic tax team. This role offers an opportunity to support the effective management of Indirect Tax obligations within Workday's International Indirect Tax department. The role encompasses a variety of responsibilities, primarily focused on assisting the Indirect Tax team with compliance, reporting, and detailed analysis related to Value Added Tax (VAT) and Goods and Services Tax (GST) across multiple jurisdictions. The analyst will support key projects and initiatives and will contribute to identifying and implementing process improvements that enhance efficiency and accuracy within the department. This position provides a unique opportunity to gain hands-on experience in a fast-paced, international environment, working alongside an experienced Indirect Tax team.
Responsibilities Assist with preparation of indirect tax returns in accordance with existing processes and controls Support indirect tax payments and obtaining refunds Assist with preparation of monthly reconciliations of indirect tax returns to the General Ledger (GL) Assist with review of Accounts Payable (AP), Accounts Receivable (AR) and intercompany transactions, and identify and correct any errors/exceptions Involvement in the month-end process, including posting journals related to Indirect Tax in Workday Financials Support with tax authority queries and audits Assist with ad hoc indirect tax projects: automation, process improvements, transactional improvements Stay up to date on changes to Indirect Tax legislation About You The ideal candidate will have some Indirect Tax compliance experience, is keen to learn and develop Indirect Tax expertise, is a good communicator and enjoys collaborating with other teams. Basic Qualifications 3+ years' experience gained in an Indirect Tax environment University degree in business, finance, accounting or a similar qualification Some knowledge of Indirect Tax regulations and compliance requirements Good understanding of accounting and its implications on indirect tax Experience with international VAT/GST is a plus Proficient in MS Office Motivated self-starter with the ability to take ownership of tasks and deliver results Key Competencies High attention to detail and accuracy Ability to clearly and concisely convey complex information Ability to work independently and in a team Ability to manage varied tasks and prioritize workload Capacity to contribute to the development of a fast-growing department Continually seeks to improve work processes and find ideas for more effective working Capacity to learn new concepts and technology quickly Flexibility to work across different time zones when required Our Approach to Flexible Work With Flex Work, we're combining the best of both worlds: in-person time and remote.
Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
Posted 3 days ago
1.0 - 6.0 years
7 - 10 Lacs
Ahmedabad
Work from Office
Qatar Airways is seeking experienced professionals for the role of Analyst Card Processing based at our Global Business Services (GBS) in Ahmedabad, India. This position will be primarily responsible for processing of Universal Air Travel Plan (UATP) transactions and monitoring the Simplified Invoicing and Settlement (IATA SIS) billing process. The role is also responsible for assisting stations/end users in resolving card payment related queries and ensuring compliance with the Payment Card Industry Data Security Standard (PCI DSS). Responsibilities: Own the PCI DSS compliance requirements for the QR network and coordinate with internal and external stakeholders to ensure compliance with the standard. Coordinate with stations and QR offices for PCI DSS compliance; maintain and review relevant documents. Monitor acquirers' PCI DSS compliance by reviewing and maintaining the relevant documents, including Attestation of Compliance (AOC) and Report on Compliance (ROC). Monitor POS/terminal compliance by reviewing and maintaining relevant documents, including PCI PTS and/or PA-DSS compliance certificates for service providers. Maintain and update the compliance trackers and data repository. Update and maintain the PCI DSS Card Matrix Master. Coordinate with and support various stakeholders on ad hoc requests and audits. Assist the internal teams with the annual PCI DSS on-site assessment and resolve audit queries, if any. Maintain and update internal policies and SOPs relevant to the card processing unit. Consolidate and validate the UATP forms and SIS files for accuracy and completeness. Prepare and post the accounting entries for UATP accounting and reconcile the same on a periodic basis. Consolidate UATP Form 1 information for the station teams and respond to station queries. Maintain user access for the UATP tool in accordance with internal policies and procedures.
Perform all activities ensuring SLAs/KPIs are achieved, including but not limited to ensuring 100% compliance with PCI DSS requirements, on-time accounting for UATP transactions, etc. Coordinate with IT on automation of existing processes and assist in continuous improvement of existing processes to improve the unit's service delivery. Perform other department duties related to his/her position as directed by the Head of the Department. Be part of an extraordinary story Your skills. Your imagination. Your ambition. Here, there are no boundaries to your potential and the impact you can make. You'll find infinite opportunities to grow and work on the biggest, most rewarding challenges that will build your skills and experience. You have the chance to be a part of our future, and build the life you want while being part of an international community. Our best is here and still to come. To us, impossible is only a challenge. Join us as we dare to achieve what's never been done before. Together, everything is possible. Qualifications Required: Bachelor's degree or equivalent with a minimum of 1 year of job-related experience Proficient in MS Excel, pivot tables and graphs Experience of working with an ERP Good command of the English language Ability to communicate properly with colleagues and other internal/external parties Working knowledge of PCI DSS and relevant card processing compliance Knowledge of working with Oracle ERP
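The UATP reconciliation duty described above is essentially a set comparison between transaction files and ledger postings. A hedged sketch in Python (the reference and amount fields are invented for illustration; real accounting code should use decimal.Decimal rather than floats for money):

```python
def reconcile(uatp_txns, gl_entries):
    """Compare UATP transactions to GL postings by reference number.
    Returns (missing_from_gl, missing_from_uatp, amount_mismatches)."""
    txn_by_ref = {t["ref"]: t["amount"] for t in uatp_txns}
    gl_by_ref = {e["ref"]: e["amount"] for e in gl_entries}
    missing_from_gl = sorted(set(txn_by_ref) - set(gl_by_ref))
    missing_from_uatp = sorted(set(gl_by_ref) - set(txn_by_ref))
    # References present on both sides but with differing amounts.
    mismatches = sorted(
        ref for ref in set(txn_by_ref) & set(gl_by_ref)
        if txn_by_ref[ref] != gl_by_ref[ref]
    )
    return missing_from_gl, missing_from_uatp, mismatches

uatp = [{"ref": "U100", "amount": 250.0}, {"ref": "U101", "amount": 80.0}]
gl = [{"ref": "U100", "amount": 250.0}, {"ref": "U102", "amount": 40.0}]
print(reconcile(uatp, gl))  # (['U101'], ['U102'], [])
```

Each of the three returned lists corresponds to an exception category an analyst would investigate before closing the period.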
Posted 3 days ago
8.0 - 13.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About Netskope Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope. As a Staff Engineer on the Data Engineering Team you'll be working on some of the hardest problems in the field of data, cloud and security, with a mission to achieve the highest standards of customer success. You will be building blocks of technology that will define Netskope's future. You will leverage open-source technologies around OLAP, OLTP, streaming, big data and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics. Your contributions will have a major impact on our global customer base and across the industry through our market-leading products. You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Conceiving and building services used by Netskope products to validate, transform, load and perform analytics on large amounts of data using distributed systems with cloud scale and reliability. Helping other teams architect their applications using services from the Data team while using best practices and sound designs.
Evaluating many open-source technologies to find the best fit for our needs, and contributing to some of them. Working with the Application Development and Product Management teams to scale their underlying services Providing easy-to-use analytics of usage patterns, anticipating capacity issues and helping with long-term planning Learning about and designing large-scale, reliable enterprise services Working with great people in a fun, collaborative environment Creating scalable data mining and data analytics frameworks using cutting-edge tools and techniques Required skills and experience 8+ years of industry experience building highly scalable distributed data systems Programming experience in Python, Java or Golang Excellent data structure and algorithm skills Proven good development practices, such as automated testing and measuring code coverage Proven experience developing complex data platforms and solutions using technologies like Kafka, Kubernetes, MySQL, Hadoop, BigQuery and other open-source databases Experience designing and implementing large, fault-tolerant and distributed systems around columnar data stores Excellent written and verbal communication skills Bonus points for contributions to the open source community Education BSCS or equivalent required, MSCS or equivalent strongly preferred #LI-SK3
Posted 3 days ago
2.0 - 7.0 years
8 - 12 Lacs
Pune
Work from Office
Python Engineer, Data-Driven Web Solutions, Pune, India Office

Job Description: We are seeking an experienced Python Engineer (Data-Driven Web Solutions) to work on strategy, design, development and implementation of large-scale systems on the cloud. The ideal candidate will be knowledgeable of Azure services and experienced with CI/CD pipelines, APIs, and relational database systems. Key Responsibilities: 1) Design and develop scalable, secure, and responsive web applications. 2) Build and maintain both front-end (using modern UI frameworks) and back-end services. 3) Create RESTful APIs to ensure smooth data flow between systems. 4) Conduct end-to-end testing to validate functionality, usability, security, and performance. 5) Implement security best practices across application and infrastructure layers. 6) Develop and manage optimized SQL databases to support application scalability and performance. 7) Azure Monitor, Application Insights and Log Analytics know-how. 8) Use tools like Azure Resource Manager (ARM) templates. 9) Design, implement and manage CI/CD pipelines using Azure DevOps. 10) Create and maintain technical documentation. 11) Troubleshoot, debug, and perform root cause analysis for production issues. 12) Review and filter large volumes of data using tools such as Power BI, Excel, Power Query, Python and other analytical tools. Requirements/Qualifications: 1) Bachelor's degree in Computer Science, Information Systems, Industrial Engineering, or a related field.
2) Minimum of 2 years of hands-on experience in demand planning, data analysis, and/or full-stack web application development. 3) Proficient in tools such as Power BI and Excel. 4) Strong experience with SQL databases, including writing and optimizing queries. 5) Solid understanding of web technologies: HTML, CSS, JavaScript, and frameworks such as React, Angular, or Vue.js. 6) Experience with server-side development using Python. 7) Familiarity with API development and integration. 8) Working knowledge of data security and application infrastructure best practices. 9) Excellent problem-solving skills and attention to detail. 10) Strong communication and collaboration skills with cross-functional teams.
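The SQL requirement (writing and optimizing queries, plus security best practices at the application layer) can be illustrated with Python's built-in sqlite3 module; the schema and data here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 300.0)],
)
# An index on the filter column keeps lookups fast as the table grows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# Parameter binding (the ? placeholder) prevents SQL injection --
# never build the query string by concatenating user input.
row = conn.execute(
    "SELECT COUNT(*), SUM(total) FROM orders WHERE customer = ?", ("acme",)
).fetchone()
print(row)  # (2, 200.0)
```

The same two habits, parameterized queries and indexes aligned with query predicates, carry over directly to Azure SQL or any other relational backend.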
Posted 3 days ago
1.0 - 2.0 years
11 - 13 Lacs
Bengaluru
Work from Office
Role: Data Engineer I Location: Bengaluru What you'll do We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better. As a Data Engineer I in our technology department, you'll bring: Excellent analytical and problem-solving skills Expertise in at least one of the big data technologies Expertise in SQL, including advanced SQL Software solution development using technologies such as MapReduce, Hive, Spark, Kafka, Yarn/Mesos etc. Good development and coding skills in Python, Java and RDBMS (SQL queries) Who are your stakeholders? The platform enables Data Scientists and Analysts to ingest, process, and analyze vast amounts of data effectively, leading to optimised models and insights. With an API-first and data-as-a-product approach, the product also has all the other capabilities as its customer base. The platform is also leveraged quite extensively for bespoke market needs, and hence local products are also stakeholders for the Data Management platform. What you'll bring Bachelor's or Master's degree in Engineering 1-2 years overall experience in data warehouse development and database design Software solution development using Hadoop technologies such as MapReduce, Hive, Spark, Kafka, Yarn/Mesos etc. Expert in SQL, having worked on advanced SQL for at least 2+ years or as a data warehouse administrator Good development skills in Java, Python or other languages Experience with EMR, S3 Knowledge and exposure to BI applications, e.g. Tableau, QlikView We've highlighted some key skills, experience and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ.
If you have a passion for the role, please still apply. What impact will you create? As a Data Engineer I you will work on data, which is at the core of all programmatic decision-making. At MiQ we have our own interoperable and unified data platform that covers all aspects of data management, from ingestion to processing to cataloguing. As a Data Engineer I you will work on a platform that enables Data Scientists and Analysts to ingest, process, and analyze vast amounts of data effectively, leading to optimised models and insights. You will work to bring the best-in-class platform that enables ease of data management with the right governance practices embedded in it. With stringent data security measures and compliance with regulations, engineers protect sensitive information and maintain customer trust. You will act as a pivot, along with Product, who through their interactions with users and stakeholders build the platform to accommodate business growth. You will continuously aspire to build and manage the Data Platform with a focus on scalability, future-proofing the data systems in line with changing programmatic trends. What's in it for you? Our Center of Excellence is the very heart of MiQ, and it's where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business. MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we're always moving towards becoming an even better place to work. Values Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them.
Just like inclusivity, our values flow through everything we do, no matter how big or small. We do what we love - Passion We figure it out - Determination We anticipate the unexpected - Agility We always unite - Unite We dare to be unconventional - Courage Benefits Every region and office has specific perks and benefits, but every person joining MiQ can expect: A hybrid work environment New hire orientation with job-specific onboarding and training Internal and global mobility opportunities Competitive healthcare benefits Bonus and performance incentives Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities Apply today! Equal Opportunity Employer
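The "advanced SQL" expectation in this posting typically means constructs like window functions. A small illustration using Python's bundled sqlite3 (window functions need SQLite 3.25+, which modern Python builds include; the table and figures are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (region TEXT, campaign TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO spend VALUES (?, ?, ?)",
    [("emea", "a", 50.0), ("emea", "b", 90.0), ("apac", "c", 30.0), ("apac", "d", 20.0)],
)
# ROW_NUMBER() over a partition picks the top-spending campaign per region,
# something a plain GROUP BY cannot express directly.
rows = conn.execute("""
    SELECT region, campaign
    FROM (
        SELECT region, campaign,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM spend
    )
    WHERE rn = 1
    ORDER BY region
""").fetchall()
print(rows)  # [('apac', 'c'), ('emea', 'b')]
```

The same pattern (rank within a partition, then filter on the rank) works unchanged in Hive, Spark SQL, and most warehouse engines.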
Posted 3 days ago
5.0 - 10.0 years
15 - 17 Lacs
Mumbai
Work from Office
Jul 28, 2025 Location: Mumbai Designation: Assistant Manager Strong understanding of cloud technologies and platforms: Azure/AWS/GCP/OCI Understanding of cloud security architecture Understanding of Zero Trust principles and of security technologies and controls: AWS/Azure/GCP/OCI cloud-native security controls, Identity and Access Management, Data Security, IDS/IPS, SIEM, web application firewalls, cryptography, Kubernetes, container security, etc. Should have conducted cloud security assessments and configuration reviews as per industry best practices Familiarity with industry-leading standards and frameworks such as ISO 27001, NIST, CSA CCM, and CIS benchmarks to help clients adhere to compliance requirements Knowledge and experience of the Risk Management Lifecycle (Risk Identification, Risk Assessment, Risk Response, and Reporting) Experience with cloud security tools and services Knowledge and experience in developing cloud security policies and frameworks for organizations Effective written and verbal communication skills Strong sense of ownership, urgency, and drive Demonstrate teamwork and collaborate with other teams to ensure the client's cloud environment is secure
Posted 3 days ago
2.0 - 3.0 years
2 - 5 Lacs
Jaipur
Work from Office
Dreamplus Colonizers and Developers Private Limited is looking for a TeleCaller to join our dynamic team and embark on a rewarding career journey. A telecaller is a customer service representative who contacts customers over the telephone. A typical job description for a telecaller includes the following responsibilities: Make outbound calls to customers to promote products and services, or follow up on recent purchases Respond to customer inquiries and provide information about products and services Resolve customer complaints and provide appropriate solutions Keep records of all customer interactions and transactions, updating customer information in a database as necessary Meet and exceed sales and customer satisfaction targets Continuously improve product and service knowledge to provide accurate information to customers Stay up-to-date with industry developments and maintain a working knowledge of competitor offerings Follow all company policies and procedures, including those related to confidentiality and data security Participate in training and development opportunities to improve skills and knowledge Adhere to schedules and work efficiently under pressure to meet deadlines
Posted 3 days ago
4.0 - 9.0 years
10 - 14 Lacs
Chennai
Work from Office
The ECM Consultant is responsible for designing, implementing, and managing enterprise content management solutions to optimize document handling, workflow automation, and data security for organizations. Responsibilities: Analyze business needs and identify ECM requirements. Design and implement ECM solutions using platforms like SharePoint, OpenText, or Alfresco. Configure workflows for document management, approval processes, and records retention. Ensure compliance with industry regulations and data security standards. Integrate ECM solutions with existing enterprise applications (ERP, CRM, etc.). Provide training and support to end users on ECM platforms. Monitor and optimize ECM system performance. Qualifications: Bachelor's degree in Information Technology, Computer Science, or a related field. 4+ years of experience with ECM platforms like SharePoint, OpenText, or Documentum. Strong understanding of content lifecycle management and compliance. Knowledge of document security and workflow automation.
Posted 3 days ago
1.0 - 2.0 years
17 - 19 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Overview of Role As a Data Engineer specializing in AI/ML, you'll be instrumental in designing, building, and maintaining the data infrastructure crucial for training, deploying, and serving our advanced AI and Machine Learning models. You'll work closely with Data Scientists, ML Engineers, and Cloud Architects to ensure data is accessible, reliable, and optimized for high-performance AI/ML workloads, primarily within the Google Cloud ecosystem. Responsibilities Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines to ingest, transform, and load data from various sources into data lakes and data warehouses, specifically optimized for AI/ML consumption. AI/ML Data Infrastructure: Architect and implement the underlying data infrastructure required for machine learning model training, serving, and monitoring within GCP environments. Google Cloud Ecosystem: Leverage a broad range of Google Cloud Platform (GCP) data services including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Vertex AI, Composer (Airflow), and Cloud SQL. Data Quality & Governance: Implement best practices for data quality, data governance, data lineage, and data security to ensure the reliability and integrity of AI/ML datasets. Performance Optimization: Optimize data pipelines and storage solutions for performance, cost-efficiency, and scalability, particularly for large-scale AI/ML data processing. Collaboration with AI/ML Teams: Work closely with Data Scientists and ML Engineers to understand their data needs, prepare datasets for model training, and assist in deploying models into production. Automation & MLOps Support: Contribute to the automation of data pipelines and support MLOps initiatives, ensuring seamless integration from data ingestion to model deployment and monitoring. Troubleshooting & Support: Troubleshoot and resolve data-related issues within the AI/ML ecosystem, ensuring data availability and pipeline health. 
Documentation: Create and maintain comprehensive documentation for data architectures, pipelines, and data models. Qualifications: 1-2+ years of experience in Data Engineering, with a direct focus on building data pipelines for AI/ML workloads. Deep, hands-on experience with core GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Composer/Airflow. Strong proficiency in at least one relevant programming language for data engineering (Python is highly preferred). Strong SQL skills for complex data manipulation, querying, and optimization. Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and schema design for analytical and AI/ML purposes. Proven experience designing, building, and optimizing large-scale ETL/ELT processes. Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop) and concepts. Exceptional analytical and problem-solving skills, with the ability to design solutions for complex data challenges. Excellent verbal and written communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.
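The pipeline work this posting describes centers on ingesting, cleaning, and loading event data for AI/ML consumption. As a purely illustrative sketch (not this employer's actual codebase; the event type and field names are hypothetical), a deduplication pass of the kind commonly run before loading events into a warehouse such as BigQuery might look like:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ClickEvent:
    user_id: str
    event_id: str
    ts: datetime

def dedupe_latest(events):
    """Keep only the most recent record per event_id -- a typical
    cleaning step before an ELT load, since upstream sources often
    redeliver events."""
    latest = {}
    for e in events:
        seen = latest.get(e.event_id)
        if seen is None or e.ts > seen.ts:
            latest[e.event_id] = e
    # Emit in timestamp order so downstream loads are deterministic.
    return sorted(latest.values(), key=lambda e: e.ts)
```

In a real GCP pipeline the same logic would more likely be expressed as a Dataflow (Apache Beam) transform or a BigQuery MERGE statement rather than plain Python, but the shape of the cleaning step is the same.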
Posted 3 days ago
4.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities: 4-7 years of experience in Data Engineering or related roles. Hands-on experience in Microsoft Fabric Hands-on experience in Azure Databricks Proficiency in PySpark for data processing and scripting. Strong command of Python & SQL: writing complex queries, performance tuning, etc. Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas). Hands-on experience in performance tuning & optimization on Databricks & MS Fabric. Ensure alignment with overall system architecture and data flow. Understanding of CI/CD practices in a data engineering context. Excellent problem-solving and communication skills. 
Exposure to BI tools like Power BI, Tableau, or Looker. Good to Have Experience with Azure DevOps. Familiarity with data security and compliance in the cloud. Experience with different databases like Synapse, SQL DB, Snowflake etc. Mandatory skill sets Microsoft Fabric, Azure (Databricks & ADF), PySpark Preferred skill sets Microsoft Fabric, Azure (Databricks & ADF), PySpark Years of experience required 4-10 Education qualification B.Tech/MBA/MCA Education Degrees/Field of Study required Bachelor of Engineering, Master of Business Administration Degrees/Field of Study preferred Required Skills Microsoft Azure Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Travel Requirements Government Clearance Required?
Posted 3 days ago
5.0 - 10.0 years
11 - 13 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building and optimizing data pipelines and data architecture, as well as experience with Azure cloud services. You will work closely with cross-functional teams to ensure data is accessible, reliable, and ready for analytics and business insights. Mandatory Skills Advanced SQL, Python and PySpark for data engineering Azure 1st party services (ADF, Azure Databricks, Synapse, etc.) Data warehousing (Redshift, Snowflake, BigQuery) Workflow orchestration tools (Airflow, Prefect, or similar) Experience with DBT (Data Build Tool) for transforming data in the warehouse Hands-on experience with real-time/live data processing frameworks such as Apache Kafka, Apache Flink, or Azure Event Hubs Key Responsibilities Design, develop, and maintain scalable and reliable data pipelines Demonstrate experience and leadership across two full project cycles using Azure Data Factory, Azure Databricks, and PySpark Collaborate with data analysts, scientists, and software engineers to understand data needs Design and build scalable data pipelines using batch and real-time streaming architectures Implement DBT models to transform, test, and document data pipelines Implement data quality checks and monitoring systems Optimize data delivery and processing across a wide range of sources and formats Ensure security and governance policies are followed in all data handling processes Evaluate and recommend tools and technologies to improve data engineering capabilities Lead and mentor junior data engineers as needed Work with cross-functional teams in a dynamic and fast-paced environment Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, or related field Certifications in Databricks Professional are preferred Technical Skills Programming: Python, PySpark, SQL ETL tools and orchestration (e.g., Airflow, 
DBT), Cloud platforms (Azure) Real-time streaming tools: Kafka, Flink, Spark Streaming, Azure Event Hubs Data Warehousing: Snowflake, BigQuery, Redshift Cloud: Azure (ADF, Azure Databricks) Orchestration: Apache Airflow, Prefect, Luigi Databases: PostgreSQL, MySQL, NoSQL (MongoDB, Cassandra) Tools: Git, Docker, Kubernetes (basic), CI/CD Soft Skills Strong problem-solving and analytical thinking Excellent verbal and written communication Ability to manage multiple tasks and deadlines Collaborative mindset with a proactive attitude Strong analytical skills related to working with unstructured datasets Good to Have Experience with real-time data processing (Kafka, Flink) Knowledge of data governance and privacy regulations (GDPR, HIPAA) Familiarity with ML model data pipeline integration Work Experience Minimum 5 years of relevant experience in data engineering roles Experience with Azure 1st party services across at least two full project lifecycles Compensation & Benefits Competitive salary and annual performance-based bonuses Comprehensive health and optional parental insurance. Optional retirement savings plans and tax savings plans. Key Result Areas (KRAs) Timely development and delivery of high-quality data pipelines Implementation of scalable data architectures Collaboration with cross-functional teams for data initiatives Compliance with data security and governance standards Key Performance Indicators (KPIs) Uptime and performance of data pipelines Reduction in data processing time Number of critical bugs post-deployment Stakeholder satisfaction scores Successful data integrations and migrations
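Among the responsibilities above is implementing data quality checks and monitoring. As a small, stack-agnostic illustration (column names are hypothetical; in practice such rules are usually declared as DBT tests or similar rather than hand-written), row-level checks of this kind might look like:

```python
def check_not_null(rows, required_columns):
    """Return the rows that violate a not-null constraint."""
    return [r for r in rows if any(r.get(c) is None for c in required_columns)]

def check_in_range(rows, column, lo, hi):
    """Return the rows whose value falls outside the inclusive range [lo, hi]."""
    return [r for r in rows if not (lo <= r[column] <= hi)]

# Example: validate a batch before it is published downstream.
batch = [
    {"order_id": 1, "amount": 49.99},
    {"order_id": None, "amount": 12.50},   # fails not-null
    {"order_id": 3, "amount": -5.00},      # fails range
]
bad = check_not_null(batch, ["order_id"]) + check_in_range(batch, "amount", 0, 10_000)
```

A monitoring system would then alert when `bad` is non-empty or when the failure rate crosses a threshold; DBT expresses the same two rules declaratively as `not_null` and `accepted_range`-style tests on the warehouse table.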
Posted 3 days ago
10.0 - 15.0 years
30 - 35 Lacs
Gurugram
Work from Office
The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at the conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. Key Responsibilities 1. Strategy Planning o Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders. o Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. o Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity. o Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement. o Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks. o Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. o Ensure that data strategies and architectures are aligned with regulatory compliance. o Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals. o Ensure effective data management throughout the project lifecycle. 2. Acquisition & Deployment o Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FPA, etc.) 
o Liaise with vendors and service providers to select the products or services that best meet company goals. 3. Operational Management Assess and determine governance, stewardship, and frameworks for managing data across the organization. Develop and promote data management methodologies and standards. Document information products from business processes and create data entities. Create entity relationship diagrams to show the digital thread across the value streams and enterprise. Drive data normalization across all systems and databases to ensure a common definition of data entities across the enterprise. Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data. Address the regulatory compliance requirements of each country and ensure our data is secure and compliant. Select and implement the appropriate tools, software, applications, and systems to support data technology goals. Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. Collaborate with project managers and business unit leaders for all projects involving enterprise data. Address data-related problems regarding systems integration, compatibility, and multiple-platform integration. Act as a leader and advocate of data management, including coaching, training, and career development for staff. Develop and implement key components as needed to create testing criteria that guarantee the fidelity and performance of the data architecture. Document the data architecture and environment to maintain a current and accurate view of the larger data picture. Identify and develop opportunities for data reuse, migration, or retirement. 4. Data Architecture Design: Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. Design and implement scalable, high-performance data solutions that meet business requirements. 5. 
Data Governance: Establish and enforce data governance policies and procedures as agreed with stakeholders. Maintain data integrity, quality, and security within Finance, HR, and other such enterprise systems. 6. Data Migration: Oversee the data migration process from legacy systems to the new systems being put in place. Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness. 7. Master Data Management: Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. Provide data management (create, update, and delimit) methods to ensure master data is governed. 8. Stakeholder Collaboration: Collaborate with various stakeholders, including business users, other system vendors, and stakeholders to understand data requirements. Ensure the enterprise system meets the organization's data needs. 9. Training and Support: Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. Promote user adoption and proper use of data. 10. Data Quality Assurance: Implement data quality assurance measures to identify and correct data issues. Ensure that Oracle Fusion and other enterprise systems contain reliable and up-to-date information. 11. Reporting and Analytics: Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems. Enable data-driven decision-making through robust data analysis. 12. Continuous Improvement: Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems. Leverage new technologies for enhanced data management to support evolving business needs. 
Technology and Tools: Oracle Fusion Cloud Data modeling tools (e.g., ER/Studio, ERwin) ETL tools (e.g., Informatica, Talend, Azure Data Factory) Data Pipelines: Understanding of data pipeline tools like Apache Airflow and AWS Glue. Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM) Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP) Hyperscalers / Cloud platforms (e.g., AWS, Azure) Big Data Technologies such as Hadoop, HDFS, MapReduce, and Spark Cloud Platforms such as Amazon Web Services, including RDS, Redshift, and S3, Microsoft Azure services like Azure SQL Database and Cosmos DB, and experience in Google Cloud Platform services such as BigQuery and Cloud Storage. Programming Languages (e.g., Java, J2EE, EJB, .NET, WebSphere) SQL: Strong SQL skills for querying and managing databases Python: Proficiency in Python for data manipulation and analysis. Java: Knowledge of Java for building data-driven applications. Data Security and Protocols: Understanding of data security protocols and compliance standards. Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred. Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable. Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. 
Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws. Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus. Behavioral A self-starter, an excellent planner and executor and, above all, a good team player Excellent communication and interpersonal skills are a must Must possess organizational skills, including multi-task capability, priority setting, and meeting deadlines Ability to build collaborative relationships and effectively leverage networks to mobilize resources Initiative to learn the business domain is highly desirable Likes a dynamic and constantly evolving environment and requirements
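The SQL and dimensional-modeling expertise this role calls for is easy to illustrate. The sketch below uses SQLite purely for self-containment (the posting's actual stack is Oracle and cloud warehouses, and the table names are hypothetical) to show a minimal star-schema aggregation, i.e. a fact table joined to one of its dimensions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# A tiny star schema: one dimension table and one fact table keyed to it.
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (product_key INTEGER REFERENCES dim_product, amount REAL);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
# Aggregate facts by a dimension attribute -- the core star-schema query shape.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_product AS p USING (product_key)
    GROUP BY p.name
    ORDER BY total DESC
""").fetchall()
# rows -> [('widget', 15.0), ('gadget', 7.5)]
```

In a snowflake variant the dimension would itself be normalized into further tables (e.g. product category split out), at the cost of extra joins for the same query.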
Posted 3 days ago
15.0 - 18.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Acuity's Data and Technology Services group is seeking a Salesforce Solution Architect with deep experience in Financial Services Cloud (FSC) and a strong understanding of private equity client and fund lifecycle processes to lead the design and delivery of Salesforce-based platforms supporting capital operations, investor onboarding, relationship management, and compliance enablement across the private equity ecosystem. Required Qualifications Salesforce Certified Application Architect and System Architect (mandatory) Salesforce Financial Services Cloud Consultant (preferred) 15+ years of Salesforce experience with at least 5+ years in an FSC solution architect role Demonstrated success in private equity or asset management environments, with exposure to GPCA, primaries, secondaries, and investor workflows Strong command of Salesforce sharing models, data security, multi-currency configuration, and compliance frameworks Preferred Skills Experience with tools such as DocuSign, Conga, MuleSoft, Snowflake, Tableau CRM, and integration with fund administration tools Exposure to third-party investor portals and AppExchange products specific to Private Equity Strong interpersonal and stakeholder management skills, able to communicate with C-level sponsors and IT leadership Key responsibilities include: Salesforce FSC Solution Leadership Lead the design and configuration of core FSC features, including: o Relationship Groups and Householding for LP structures o Lead and Referral Management for GP/LP acquisition and qualification o Financial Accounts and Holdings to track capital commitments and distributions o Opportunity Qualification across various private equity segments o Life Events and Business Milestones to trigger investor servicing workflows o Interaction Summaries and Activity Capture for visibility into relationship touchpoints o Onboarding and KYC workflows aligned with compliance requirements Align platform capabilities to the full lifecycle of capital 
raising, onboarding, deal servicing, and distribution management. Architectural Ownership Own the end-to-end Salesforce architecture across multiple clouds and business units, ensuring scalability, compliance, and extensibility. Collaborate with technical architects and global development teams to define data models, integration touchpoints, and component design strategies. Ensure alignment with Salesforce platform limits, enterprise standards, and future expansion plans (e.g., Service Cloud, Data Cloud, Einstein). Private Equity Process Enablement Translate business processes such as GP onboarding, primaries, secondaries, capital calls, and distribution tracking into CRM workflows. Enable compliance tracking through FSC's built-in capabilities, in conjunction with external KYC/AML tools and document management systems. Design role-based access for Investor Relations, Legal, Compliance, and Client Services teams. Global Delivery Collaboration Operate effectively in a global delivery model, engaging with distributed stakeholders, developers, testers, and platform teams. Drive architectural governance, solution consistency, and cross-team collaboration across time zones and workstreams. Support the onboarding of additional regions, funds, or legal entities through scalable org strategies and rollout frameworks. Future-Ready Architecture Guide optional capabilities using Tableau CRM, Salesforce Data Cloud, and Einstein AI for LP segmentation, engagement scoring, and capital forecasting. Support integration strategies with third-party tools such as fund admin systems, document generation, and identity verification platforms.
Posted 3 days ago
7.0 - 10.0 years
9 - 12 Lacs
Pune
Work from Office
about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. Information Security Project Specialist ZS's India Capability & Expertise Center (CEC) houses more than 60% of ZS people across three offices in New Delhi, Pune and Bengaluru. Our teams work with colleagues across North America, Europe and East Asia to create and deliver real-world solutions to the clients who drive our business. The CEC maintains standards of analytical, operational and technological excellence across our capability groups. Together, our collective knowledge enables each ZS team to deliver superior results to our clients. What You'll Do: Executes the end-to-end management of security projects, including resource management, communications, training requirements, change management and budget (if applicable). Estimates the resources and participants needed to achieve project goals. Reviews and recommends changes, reductions or additions to the overall project. Acts as the liaison between InfoSec and end-users when applicable. Maintains the efficiency of the project management process, such as planning, scheduling, budgeting and risk assessment. Identifies and mitigates potential risks. Works with cross-functional teams and staff of all levels, including assisting in the development, training and assignment of work/projects to team members reporting to others; works well within a structured environment in which team members can work together as an efficient team. What You'll Bring: Bachelor's Degree required. 7-10 years of relevant work experience, including Information Security, project management (5+ years), and team management. PMP-PMI certification desired, or completion within a year of assuming the position. Agile certification desired, or completion within a year of assuming the position. 
Security+ or equivalent certification desired, or completion within a year of assuming the position (e.g., CISM - Certified Information Security Manager, CompTIA Security+, etc.). Project plan development experience, including charter, scope, project management approach, management plans, statement of work, cost estimates, and schedule. Excellent communication (written and oral) and interpersonal skills; ability to interface with and influence all levels within the organization, including facilitation, consulting, negotiation, and presentation. Excellent project management and coordination skills working with multiple stakeholders across several technology platforms and business areas. Strong technical skills and experience. The ideal candidate has led projects relating to Information Security deliveries or migrations (Vulnerability Management, Identity and Access Management, Cloud Strategy & Governance, Data Security, Enterprise Risk Management, Asset Management, Security Awareness & Training). Project plan and budget management. Knowledge of project management best practices. Experience identifying and mitigating risk.
Posted 3 days ago
8.0 - 10.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Role Overview: As an Agile Program Manager for a Skyhigh Product Group, you will act as a central coordination point providing a consolidated and holistic view of program delivery progress across a Product Group. You will ensure clear and transparent communication with all stakeholders, and your passion and enthusiasm for organization and attention to detail will enable you to fully execute the planning, facilitation, and communication of complex deliveries. You will bring best-practice Agile delivery improvements to the team, embedding a data-driven approach to driving a continuous improvement culture. In this role: The Agile Program Manager is part of a team of program managers that operate across the various product groups that together make up Skyhigh Security's portfolio of products. Role details: Program Leadership: Work with senior leadership to ensure that the Product Domain and program goals are aligned with the company's strategic vision. Lead the end-to-end planning, driving accountability in teams towards delivery of major initiatives within the product domain. Define the program milestones and success criteria in alignment with OKRs. Plan, facilitate, and communicate across product domains to provide a holistic, consolidated Product Group delivery with transparent progress information at the portfolio level. 
This includes: Proactively identifying and managing major dependencies related to departments outside of engineering, particularly in relation to New Product Introduction items. Collaborating with teams across product management, engineering, design, marketing, sales and customer success to ensure alignment and seamless delivery execution. Owning and delivering all reporting, including to executive stakeholders, on program progress, RAID and milestones. Fostering a clear and effective communication approach so all Product Group portfolio information is readily available. Coordinating annual and quarterly portfolio planning. Proactively identifying, assessing and mitigating Product Group-level risks. Delivering and executing all initiative tracking, including workforce allocation against business-defined goals and budget guardrails, and value tracking for limited-availability releases and recent GA releases. You will also: Ensure Jira can deliver consistent portfolio-level reports, while enforcing adherence within the teams for the collection of core data. Identify key dependencies across the product group and the wider portfolio, ensuring these are picked up and owned by the appropriate Engineering Manager. Seek out continuous improvement by working alongside other Program Managers to drive a common approach to portfolio management for process, tools and people. You'll establish portfolio execution KPIs at the Product Group level, while seeking out ways to drive improvement initiatives to improve those KPIs. Provide coaching and development to the teams related to agile delivery best practices. General Background and Experience required for a Program Manager: 8-10+ years of agile program management experience across Engineering Product Domains. At least 3+ years managing complex Engineering initiatives for a Product Group comprising multiple product domains. Experience working with distributed Engineering teams across time zones, in a global organization. 
Extensive expertise of agile program management discipline and methodologies. Demonstrated ability to facilitate, lead, organize and motivate matrix teams while working across team dependencies to achieve Program results within defined project milestones and identified timelines. Excellent time management, communication (written, verbal), and organization skills across multiple levels and functional areas, with a strong ability to cohesively synthesize data and key points for both internal and executive consumption. Excellent knowledge of change management methodology. Tools: Proficiency in Agile Program Management tools e.g. Jira, Confluence It would be great if you also have the following, but they are not required : PMP certification Agile Certification.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You should have at least 5+ years of experience in HCM Release Management for Oracle HCM components such as EXTRACT, ALERT, BIP, and Reports. You must be proficient in utilizing Oracle CSM & FSM tools for automated Release Management. Additionally, prior experience in managing Oracle POD, instance strategy, P2T, T2T, and Data Masking is required. It is essential that you have experience in Certificate Management for real-time integrations and in automating user, role, and Areas of Responsibility (AOR) creation in Oracle HCM. Experience with LBAC feature implementation and integration with SSO tools like OKTA is highly beneficial. You should have hands-on experience in configuring Oracle HCM role-based security across functional areas, including HCM and Recruiting. Proficiency in Oracle Cloud HCM security setup and modifications related to roles, permissions, and data security is necessary, including building custom roles based on the delivered roles provided in the Oracle product. Nice-to-have skills include prior experience in implementing Continuous Integration & Continuous Deployment with Oracle HCM & Oracle Integration Cloud, as well as automating user & AOR assignment with enterprise systems.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
thane, maharashtra
On-site
You will be working as a Full Stack Developer at NIRMAL Industrial Controls Pvt Ltd, a leading provider of solutions for Natural Gas consumption based in Thane, Mumbai. The company specializes in the design, manufacturing, and commissioning of natural gas conditioning, regulating, and metering skids. Your responsibilities will include: - Utilizing strong hands-on development experience in PHP (Laravel, Symfony, or CodeIgniter), Python (Django or Flask), and Node.js (Express, Nest.js). - Demonstrating proficiency in front-end technologies such as HTML5, CSS3, JavaScript, and modern JS frameworks like React/Vue. - Working with relational databases like MySQL or PostgreSQL. - Understanding REST APIs, microservices architecture, and asynchronous processing. - Familiarity with Docker, Git, CI/CD pipelines, and cloud deployment (AWS, Azure). - Exposure to manufacturing systems (ERP, MES, SCADA, IoT platforms) is a strong plus. - Knowledge of message queues (Kafka, RabbitMQ) or real-time data processing is a bonus. Your main tasks will involve: - Designing, developing, and maintaining web applications using front-end and back-end technologies. - Building and consuming RESTful APIs to support integration with ERP, MES, IoT, and third-party platforms. - Ensuring technical feasibility of UI/UX designs and optimizing applications for maximum speed and scalability. - Developing intuitive front-end interfaces using JavaScript, React.js, Vue.js, or similar frameworks. - Collaborating closely with manufacturing, production, and QA teams to automate workflows and data reporting. - Integrating with shop-floor systems and devices for real-time data visibility and analysis. - Writing clean, maintainable, and efficient code following best practices and coding standards. - Participating in code reviews, system design discussions, and technical planning. - Debugging and resolving software defects and performance issues. 
- Evaluating and adopting new platforms and tools relevant to Industry 4.0 and smart manufacturing.
- Ensuring data security and implementing authentication and authorization mechanisms.
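As an illustrative sketch of the RESTful-API work this role describes, here is a minimal, framework-free JSON endpoint exposing hypothetical machine-status data (the resource name, fields, and routing are invented for the example; a real implementation would use one of the listed frameworks such as Flask, Django, or Express):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical in-memory resource, standing in for data that might come
# from an ERP/MES or shop-floor system.
MACHINES = {"M1": {"id": "M1", "status": "running"}}

class MachineAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /machines/<id> -> JSON document, or 404 if unknown
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "machines" and parts[1] in MACHINES:
            body = json.dumps(MACHINES[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

# Bind to an OS-assigned free port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), MachineAPI)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consume the API the same way an integrating service would.
with urlopen(f"http://127.0.0.1:{port}/machines/M1") as resp:
    data = json.loads(resp.read())
server.shutdown()
```

The pattern shown, a routed handler returning JSON over HTTP, is what frameworks like Flask or Express abstract behind decorators and middleware.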
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Engineer at our company, you will be responsible for designing and implementing Azure Synapse Analytics solutions for data processing and reporting. Your role will involve optimizing ETL pipelines, SQL pools, and Synapse Spark workloads to ensure efficient data processing. It will also be crucial for you to uphold data quality, security, and governance best practices while collaborating with business stakeholders to develop data-driven solutions. Additionally, part of your responsibilities will include mentoring a team of data engineers. To excel in this role, you should have 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Your expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes will be essential. Experience with Fabric is strongly desirable, and possessing strong leadership, problem-solving, and stakeholder management skills is crucial. Knowledge of Power BI, Python, or Spark would be a plus. You should also have deep knowledge of Data Modelling techniques, Design and development of ETL Pipelines, Azure Resources Cost Management, and proficiency in writing complex SQL queries. Furthermore, you are expected to have knowledge and experience in Master Data/metadata management, including Data Governance, Data Quality, Data Catalog, and Data Security. Your ability to manage a complex and rapidly evolving business, actively lead, develop, and support team members will be key. As an Agile practitioner and advocate, you must be highly dynamic in your approach, adapting to constant changes in risks and forecasts. Your role will involve ensuring data integrity within the dimensional model by validating data and identifying inconsistencies. You will also work closely with Product Owners and data engineers to translate business needs into effective dimensional models. 
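The ETL pipeline and data-quality work described above can be illustrated in miniature. This is a sketch with invented sample data: extract raw rows, transform them (enforcing simple data-quality rules), and load them into a SQL table, with SQLite standing in for a Synapse SQL pool:

```python
import sqlite3

# Hypothetical raw extract, with the quality problems an ETL pipeline
# typically has to handle: a missing value and a duplicate key.
raw_rows = [
    {"sale_id": "1", "amount": "100.50", "region": " North "},
    {"sale_id": "2", "amount": None,     "region": "South"},    # missing amount
    {"sale_id": "1", "amount": "100.50", "region": " North "},  # duplicate key
]

def transform(rows):
    """Drop rows with missing amounts, trim text fields, deduplicate on key."""
    seen, clean = set(), []
    for r in rows:
        if r["amount"] is None or r["sale_id"] in seen:
            continue
        seen.add(r["sale_id"])
        clean.append({"sale_id": int(r["sale_id"]),
                      "amount": float(r["amount"]),
                      "region": r["region"].strip()})
    return clean

# Load step: write the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (sale_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (:sale_id, :amount, :region)", transform(raw_rows))
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
```

In Azure Data Factory or Synapse pipelines the same extract/transform/load stages exist, but the transform rules are typically expressed in mapping data flows, Spark, or SQL rather than plain Python.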
This position offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and receive competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. Ideally, you should hold a Bachelor's/Master's degree in Software Engineering, Computer Science, or a related area. Our company offers a range of benefits, including hybrid working arrangements, an annual performance-related bonus, Flexi Any Days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture. MRI Software is dedicated to delivering innovative applications and hosted solutions that empower real estate companies to elevate their business. With a strong focus on meeting the unique needs of real estate businesses globally, we have grown to include offices across various countries, with over 4000 team members supporting our clients. MRI is proud to be an Equal Employment Opportunity employer.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Database Engineer based in Coimbatore with over 5 years of experience, you will be responsible for managing and administering databases efficiently. Your qualifications include a Bachelor's degree in Computer Science, Information Technology, or a related field. Your role will require proficiency in relational database management systems such as MySQL and PostgreSQL, as well as NoSQL databases like MongoDB. You should possess strong skills in SQL query writing and optimization. Routine maintenance tasks, software updates, backups, and database security measures will be part of your responsibilities. Ensuring compliance with data privacy regulations like GDPR and HIPAA, along with implementing strategies for scaling databases as data volume and traffic increase, will be crucial aspects of your job. You will also need to evaluate and implement scalability techniques like sharding and partitioning when necessary. Moreover, setting up monitoring tools, configuring alerting systems, and knowledge of data security best practices and regulatory compliance are essential requirements for this role. Your problem-solving, troubleshooting, communication, and collaboration skills will be put to the test in this challenging position. Additionally, experience with scripting and automation for database tasks, and familiarity with monitoring and management tools for DBMS, will be advantageous.
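Sharding, one of the scalability techniques the role mentions, can be sketched as a routing function: each key is hashed to pick the database shard that owns its rows. The shard names below are placeholders, not a real topology:

```python
import hashlib

# Illustrative shard list; in practice these would be connection strings
# or DSNs for separate database servers.
SHARDS = ["db_shard_0", "db_shard_1", "db_shard_2", "db_shard_3"]

def shard_for(customer_id: str) -> str:
    """Pick a shard deterministically from the customer id.

    A real digest (md5 here) rather than Python's built-in hash() keeps
    the mapping stable across processes and restarts, which matters for
    routing reads back to the shard that holds the data.
    """
    digest = hashlib.md5(customer_id.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]
```

One caveat worth noting: adding a shard changes the modulus and remaps most existing keys, so production systems often use consistent hashing to limit that reshuffling.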
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Analytics Manager, you will be responsible for leading and managing data analytics deliverables to ensure successful completion within defined timelines and budget. You will collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Your role will involve designing, developing, and maintaining Qlik, Power BI, and Alteryx applications and dashboards. In this position, you will be tasked with maintaining and supporting the Power BI environment, including access configuration and building authorization rules. Providing technical leadership and guidance to the data analytics team is a crucial aspect of your responsibilities to ensure the delivery of high-quality solutions. You will also play a key role in the migration of Qlik applications to Power BI, ensuring a smooth transition with minimal disruption to business operations. This will involve understanding Qlik report requirements and translating them into equivalent Power BI reports, as well as proposing alternative solutions or workarounds for the migration process. Regular reviews and assessments of existing data analytics processes will be part of your duties, where you will identify areas for improvement and implement enhancements. Staying up-to-date with the latest trends and advancements in data analytics tools and technologies will be essential to provide recommendations for their adoption. Collaborating with stakeholders to understand their reporting and analytical needs and developing solutions to address them will be a key aspect of your role. Additionally, you will be responsible for training and mentoring team members on Qlik, Power BI, and Alteryx tools and best practices. Ensuring compliance with data security and privacy regulations when handling sensitive information is paramount. 
You should also be familiar with SDLC processes and be able to create and update artifacts such as functional and non-functional specifications, technical design documents, test plans, test cases, release procedures, system operational documents, and user manuals. Your role will also involve providing ad-hoc support for other IT service requests, such as data extraction, data alteration, extracting system logic, and responding to user inquiries about data and logic in the system.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a talented and detail-oriented Business Intelligence (BI) Developer with a focus on developing and creating visually appealing dashboards. Your role is crucial in translating complex data sets into user-friendly visualizations that highlight key insights and trends. You will design, develop, and maintain interactive dashboards using Power BI, working with large datasets to extract, clean, and transform data for consumption. Collaborating with business stakeholders, you will understand their data needs and translate them into dashboard requirements. Regular feedback sessions with end-users will ensure that the dashboards meet business needs effectively. Your responsibilities also include optimizing dashboards for performance and usability, and updating them regularly to reflect the latest data and business metrics. To qualify for this role, you should have at least 5 years of experience as a BI Developer, focusing on dashboard development. Proficiency in Power BI, strong SQL skills, and experience with database management systems are essential. You should possess excellent data visualization skills, experience with ETL processes and tools, and familiarity with data warehousing concepts and cloud platforms. Knowledge of programming languages like Python or R, an understanding of data governance and security best practices, and the ability to translate business requirements into effective dashboards are also required. Strong analytical, problem-solving, communication, and collaboration skills are crucial for success in this role. A Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field is necessary for consideration.
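Dashboard KPIs of the kind this role builds are often driven by window-function SQL. As an illustrative example with invented sample data (SQLite standing in for the warehouse behind the dashboard), a monthly running total:

```python
import sqlite3

# Tiny stand-in fact table: one row per month.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE monthly_sales (month TEXT, amount REAL)")
conn.executemany("INSERT INTO monthly_sales VALUES (?, ?)",
                 [("2024-01", 100.0), ("2024-02", 150.0), ("2024-03", 120.0)])

# A window function computes the cumulative total alongside each month,
# the shape a running-total dashboard visual consumes directly.
rows = conn.execute("""
    SELECT month,
           amount,
           SUM(amount) OVER (ORDER BY month) AS running_total
    FROM monthly_sales
    ORDER BY month
""").fetchall()
```

Pushing a calculation like this into the query (or into a DAX measure in Power BI) keeps the dashboard layer thin and the logic testable in one place.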
Posted 3 days ago