3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Empowering contact center stakeholders with real-time insights, our tech facilitates data-driven decision-making for contact centers, enhancing service levels and agent performance.

As a vital team member, you will work with cutting-edge technologies and play a high-impact role in shaping the future of AI-driven enterprise applications. You will work directly with people who've worked at Amazon, Facebook, Google, and other leading technology companies. With Level AI, you will get to have fun, learn new things, and grow along with us. Ready to redefine possibilities? Join us!

We'd love to explore more about you if you have:
- Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier 1 engineering institute, with relevant work experience at a top technology company in computer science or mathematics-related fields, and 3-5 years of experience in machine learning and NLP
- Knowledge and practical experience in solving NLP problems in areas such as text classification, entity tagging, information retrieval, question-answering, natural language generation, clustering, etc.
- 3+ years of experience working with LLMs in large-scale environments.
- Expert knowledge of machine learning concepts and methods, especially those related to NLP, Generative AI, and working with LLMs
- Knowledge and hands-on experience with Transformer-based Language Models like BERT, DeBERTa, Flan-T5, Mistral, Llama, etc.
- Deep familiarity with the internals of at least a few machine learning algorithms and concepts
- Experience with deep learning frameworks like PyTorch and common machine learning libraries like scikit-learn, NumPy, pandas, NLTK, etc.
- Experience with ML model deployments using REST APIs, Docker, Kubernetes, etc.
- Knowledge of cloud platforms (AWS/Azure/GCP) and their machine learning services is desirable
- Knowledge of basic data structures and algorithms
- Knowledge of real-time streaming tools/architectures like Kafka and Pub/Sub is a plus

Your role at Level AI includes but is not limited to:
- Big picture: understand customers' needs, innovate, and use cutting-edge deep learning techniques to build data-driven solutions
- Work on NLP problems across areas such as text classification, entity extraction, summarization, generative AI, and others
- Collaborate with cross-functional teams to integrate/upgrade AI solutions into the company's products and services
- Optimize existing deep learning models for performance, scalability, and efficiency
- Build, deploy, and own scalable production NLP pipelines
- Build post-deployment monitoring and continual learning capabilities; propose suitable evaluation metrics and establish benchmarks
- Keep abreast of SOTA techniques in your area and exchange knowledge with colleagues
- Desire to learn, implement, and work with the latest emerging model architectures, training and inference techniques, data curation pipelines, etc.

To learn more visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
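The text-classification work this role describes can be illustrated with a toy example. The sketch below is a minimal bag-of-words Naive Bayes classifier in pure Python, not the transformer/LLM approach the posting actually calls for; the tiny contact-center intent dataset is invented for illustration.

```python
import math
from collections import Counter, defaultdict

# Invented training data: contact-center utterances with intent labels.
TRAIN = [
    ("i want to cancel my subscription", "cancel"),
    ("please cancel my account today", "cancel"),
    ("how do i upgrade my plan", "upgrade"),
    ("i would like to upgrade to premium", "upgrade"),
]

def train_nb(examples):
    """Fit per-label word counts and label priors for multinomial Naive Bayes."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in examples:
        tokens = text.split()
        word_counts[label].update(tokens)
        label_counts[label] += 1
        vocab.update(tokens)
    return word_counts, label_counts, vocab

def predict(text, word_counts, label_counts, vocab):
    """Return the label with the highest log posterior, with Laplace smoothing."""
    total = sum(label_counts.values())
    best_label, best_score = None, -math.inf
    for label, n in label_counts.items():
        score = math.log(n / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in text.split():
            score += math.log((word_counts[label][tok] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

wc, lc, vocab = train_nb(TRAIN)
print(predict("cancel my plan please", wc, lc, vocab))  # → cancel
```

A production system at the scale this posting describes would replace the bag-of-words model with a fine-tuned transformer, but the train/predict split and the evaluation question (which label maximizes the posterior?) stay the same.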
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The strategy of GTS focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

Role Overview
As a Data Engineering Lead, you will be responsible for overseeing and guiding the data engineering team in developing, optimizing, and maintaining our data infrastructure. You will play a critical role in ensuring the seamless integration and flow of data across the organization, enabling data-driven decision-making and analytics.

Key Responsibilities
- Data Integration: Coordinate with various teams to ensure seamless data integration across the organization's systems.
- ETL Processes: Develop and implement efficient data transformation and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize data flow and system performance for enhanced functionality and efficiency.
- Data Security: Ensure adherence to data security protocols and compliance standards to protect sensitive information.
- Infrastructure Management: Oversee the development and maintenance of the data infrastructure, ensuring scalability and reliability.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven initiatives.
- Innovation: Stay updated with the latest trends and technologies in data engineering and implement best practices.

Qualifications
- Experience: Proven experience in data engineering, with a strong background in leading and managing teams.
- Technical Skills: Proficiency in programming languages such as Python, Java, and SQL, along with experience in big data technologies like Hadoop, Spark, and Kafka.
- Data Management: In-depth understanding of data warehousing, data modeling, and database management systems.
- Analytical Skills: Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Communication: Excellent communication and interpersonal skills, capable of working effectively with cross-functional teams.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Why Join Us?
- Work on cutting-edge data projects and contribute to the organization's data strategy.
- Collaborative and innovative work environment that values creativity and continuous learning.

If you are a strategic thinker with a passion for data engineering and leadership, we would love to hear from you. Apply now to join our team and make a significant impact on our data-driven journey. Joining us is more than saying "yes" to making the world a healthier place. It's discovering a career that's challenging, supportive, and inspiring. Where a culture driven by excellence helps you not only meet your goals, but also create new ones. We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace, and thriving on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.
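The ETL responsibility above follows the standard extract-transform-load pattern. A minimal in-memory sketch (the record layout is invented; real pipelines would read from databases or message streams rather than a string):

```python
import csv
import io

# Extract: raw rows from a source. An in-memory CSV stands in for a real
# upstream system such as a database export or file drop.
RAW = "id,amount\n1, 10.5 \n2,\n3,7.25\n"

def extract(text):
    """Read raw records from the source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Clean values and drop records with missing amounts."""
    out = []
    for r in rows:
        amount = r["amount"].strip()
        if not amount:
            continue  # reject incomplete records
        out.append({"id": int(r["id"]), "amount": float(amount)})
    return out

def load(rows, target):
    """Append cleaned rows to a target store (here a plain list)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded, warehouse)  # 2 rows survive cleaning
```

The same three-stage shape scales up directly: in a Spark or Kafka setting each stage becomes a distributed job or stream processor, but the contract between stages is unchanged.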
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Company
Our client is a trusted global innovator of IT and business services. We help clients transform through consulting, industry solutions, business process services, digital & IT modernization, and managed services. Our client enables them, as well as society, to move confidently into the digital future. We are committed to our clients' long-term success and combine global reach with local client attention to serve them in over 50 countries around the globe.

Job Title: Jr SAP BODS Developer
Location: Hyderabad
Experience: 5+ yrs
Job Type: Contract to hire
Notice Period: Immediate joiner

Position Overview
- Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria)
- Understand and contribute to the documentation of the data mapping specifications, conversion rules, and technical design specifications as required
- Build the conversion processes and associated programs that will migrate the data per the design and conversion rules that have been signed off by the client
- Execute all data migration technical steps (extract, transform & load), as well as defect management and issue resolution
- Perform data load activities for each mock load, cutover simulation, and production deployment identified in the L1 plan, into the environments identified
- Provide technical support, defect management, and issue resolution during all testing cycles, including mock data load cycles
- Complete all data migration documentation necessary to support system validation/compliance requirements
- Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation)

Job Requirements
- 2-4 yrs. of overall technical experience in SAP BODS across all the SAP BODS application modules (Extract, Transform, Load)
- 1-2 yrs. of data migration experience with S/4 HANA/ECC implementations
- Experience with BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores, and Formats
- Experience with BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughputs to improve job performance
- Experience in ETL using SAP BODS and SAP IS with respect to SAP master/transaction data objects in SAP FICO, SAP SD, SAP MM/WM, and SAP Plant Maintenance

Qualifications
Bachelor's degree in Computer Science (or related field)
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
PhysicsWallah is an Indian online education technology startup based in Delhi, created as a YouTube channel in 2014 by Mr. Alakh Pandey. We are the first company aiming to build an affordable online education platform for every Indian student who dreams of IIT & AIIMS but is unable to afford the existing offline/online education providers.

About the Role: Script Writer & Video Checker

Role Summary
To conceptualize and develop high-retention Hinglish scripts for educational video content, blending storytelling, entertainment, and learning to create engaging short-form content for a wide audience.

Responsibilities

Script Development
- Develop clear, concise, and well-structured scripts in Hinglish (Hindi-English blend).
- Work on pre-finalized video titles in coordination with content creators and video content managers.
- Utilize a variety of content formats, including storytelling, real-life analogies, question & answer, listicles, and explainer videos.
- Design scripts with a high-retention narrative structure:
  - Hook: strong, attention-grabbing intro within the first 20-30 seconds
  - Body: logically progressing, value-rich mid-section
  - Closure: clear takeaway or actionable ending (call to action)

Localization & Tone Adherence
- Use simple, everyday Hindi vocabulary and conversational phrasing to appeal to Tier 2/3 urban and semi-urban audiences.
- Maintain a light, humorous, and engaging tone throughout the script to maximize relatability.
- Avoid overuse of technical jargon unless contextually required; ensure clarity and ease of understanding.
- Integrate regional idioms, cultural references, and lifestyle analogies to make complex topics feel personal and local.

Subject Translation & Simplification
- Convert expert-level or technical content into simplified, entertaining, and actionable formats.
- Participate in expert briefing calls and ask relevant, strategic questions to extract audience-friendly insights.
- Distill complex ideas into relatable everyday narratives with emotional or humorous flavor wherever suitable.

Trend Integration & Content Innovation
- Stay up to date with current internet culture, social media trends, viral formats, and seasonal themes.
- Integrate trending memes, pop-culture references, or real-world hooks where appropriate, while keeping content evergreen.
- Experiment with new script structures and storytelling approaches based on evolving audience preferences and short-form content platforms (e.g., YouTube Shorts, Instagram Reels).

Revision Management & Collaboration
- Respond to feedback from video content managers, editors, and quality teams with openness and timeliness.
- Proactively revise scripts based on clarity, tone, retention needs, or data-backed viewer performance insights.
- Collaborate with video editors and designers to ensure visual elements are aligned with the script narrative.
- Provide briefing sessions to content creators or on-screen talent after script approval to ensure smooth delivery during shoots.

Output Planning, Idea Pipeline & Consistency
- Maintain a consistent output of high-quality scripts as per team-set weekly/monthly quotas.
- Balance volume with creativity; ensure delivery timelines are met without compromising storytelling quality.
- Create and maintain a categorized, topic-wise idea repository for future content ideation and faster turnaround.

Qualifications

Required Skills:
- Strong project and stakeholder management skills
- Excellent communication and team coordination abilities
- Familiarity with educational video production workflows
- Proficiency in content/project management tools (e.g., Trello, Notion, Google Sheets)
- Analytical mindset with the ability to interpret performance data and derive actionable insights

Preferred Skills:
- Prior work in edutainment, infotainment, or explainer content
- Experience in content strategy, meme writing, or performance-based copywriting
- Basic understanding of SEO for video titles and thumbnails
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 day ago
5.0 years
0 Lacs
India
Remote
Req ID: 325834

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sustainable IT Technical Product Leader to join our team, fully remote, in Karnataka (IN-KA), India (IN).

Sustainable IT Technical Product Leader, IT Sustainability CoE

The IT Sustainability CoE oversees the delivery of initiatives that translate the overall NTT business sustainability strategy and policies into a comprehensive IT strategy, standards, policies, and solutions to minimize environmental impact and promote responsible technology practice. IT Sustainability is focused on driving a culture of Sustainability in IT, Sustainable Sourcing for IT, and supporting our business partners in advancing their sustainability goals and objectives with Sustainability by IT.

We are seeking a visionary Sustainable IT Technical Product Leader to drive our organization's commitment to environmentally responsible technology solutions, as well as to manage the implementation, ongoing roadmap, and support of the corporate Environmental, Social and Governance (ESG) platform. This role combines technical expertise with sustainability knowledge to develop and manage eco-friendly IT products and services. As the Sustainable IT Technical Product Leader, you will play a crucial role in shaping our technology footprint and supporting the sustainability products the business requires. This role is responsible for ensuring optimal performance across all Sustainability Technology Programs, with a focus on those supporting improvements in both Sustainability Sales Growth and Sustainability Site Operations.
Your work will directly contribute to reducing our carbon footprint, optimizing resource usage, and positioning our company as a leader in sustainable IT practices, while ensuring alignment with business goals and regulatory requirements. Combining the skills of a technical product manager (TPM) with a strong focus on sustainability, the Sustainable IT Technical Product Leader role meets the growing demand for environmentally responsible technology solutions in today's market.

Key Responsibilities

Product Strategy, Development and Implementation
- Develop and execute a product roadmap that aligns sustainability goals and business objectives.
- Contribute to all parts of sustainability technology product development: discovery and planning, requirements gathering, technical design and build, testing, and deployment.
- Maintain an end-to-end view of all product development and capably discuss required features and developments with the various platform owners.
- Translate complex sustainability requirements into actionable technical solutions.
- Collaborate with engineering teams to ensure product architecture and design meet environmental standards.
- Determine appropriate support models to ensure any new products/technologies will meet functional and nonfunctional requirements.
- Maintain governance controls across the sustainability product lifecycle by supporting the process of verifying that governance deliverables and procedures are followed across the technology program.
Sustainability Initiatives
- Lead end-to-end sustainability initiatives for IT operations and infrastructure.
- Conduct market research to identify emerging trends in sustainable technology.
- Define and track key performance indicators (KPIs) for product sustainability.

Cross-functional Collaboration
- Work closely with business units, engineering teams, IT teams, IT architecture teams, and sustainability experts.
- Communicate technical concepts to non-technical stakeholders effectively.
- Mentor IT team members on sustainable IT practices.

Innovation and Continuous Improvement
- Stay updated on industry developments in sustainable technology.
- Identify opportunities for reducing the environmental impact of IT products and services.
- Implement and optimize lifecycle management processes for sustainable products.

Primary Objectives
The role of a Sustainable IT Technical Product Leader differs from a traditional Technical Product Manager (TPM) in several key aspects:
- Sustainability Focus: While a traditional TPM primarily focuses on technical implementation and product development, a Sustainable IT Technical Product Leader places a strong emphasis on environmental sustainability in all aspects of the product lifecycle.
- Environmental Impact Assessment: This role requires a deep understanding of environmental standards and the ability to assess and minimize the ecological footprint of IT products and services, which is not typically a primary concern for traditional TPMs.
- Sustainability KPIs: Unlike traditional TPMs, who mainly track technical and business KPIs, a Sustainable IT Technical Product Leader will also define and monitor key performance indicators specifically related to product sustainability.
- Cross-functional Collaboration: While TPMs collaborate with engineering teams, the Sustainable IT Technical Product Leader must also work with sustainability experts and business units to align technical solutions with environmental goals.
- Lifecycle Management: This role places a greater emphasis on implementing and optimizing lifecycle management processes for sustainable products, which may not be a primary focus for traditional TPMs.
- Market Research: The Sustainable IT Technical Product Leader conducts specialized market research to identify emerging trends in sustainable technology, going beyond the typical market analysis performed by TPMs.
- Innovation in Sustainability: This role requires a unique blend of technical expertise and sustainability knowledge to drive innovation in environmentally responsible technology solutions, which is not typically expected from traditional TPMs.

Skills and Qualifications
By possessing these qualifications and skills, a Sustainable IT Technical Product Leader can effectively align, implement, and support sustainable IT strategies and technology while adhering to the architectural standards of the organization and bringing environmental responsibility and sustainable practices into the IT organization.
- At least 5 years in technical leadership roles and 1-2 years of sustainable IT experience
- Proven experience as a Technical Product Manager or similar role in the IT industry
- Strong understanding of software development processes and sustainable technologies
- Excellent communication and stakeholder management skills
- Proficiency in Agile/Scrum methodologies
- Demonstrated ability to balance technical feasibility with sustainability goals
- Passion for environmental sustainability and its tech-driven possibilities

Hard Skills
- Digital literacy and tech proficiency: familiarity with relevant software, applications, and digital tools specific to the job role.
- Business literacy related to ESG: familiarity with sustainability language, frameworks, and Sustainable IT impacts.
- Data analysis and interpretation: ability to extract insights from data and use them for decision-making.
- Technical proficiencies: specific software, tools, or programming languages required for the position, plus a solid understanding of AI.

Soft Skills
- Communication skills: ability to convey ideas clearly and effectively.
- Leadership skills: capacity to guide and motivate others.
- Teamwork skills: ability to collaborate effectively with colleagues.
- Adaptability and flexibility: willingness to embrace change and navigate uncertainties.
- Critical thinking and problem-solving: analyzing complex situations and developing innovative solutions.
- Emotional intelligence: self-awareness, empathy, and strong interpersonal skills.
- Self-management: time management, organization, and self-motivation.
- Continuous learning: commitment to upskilling and staying current with industry trends.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us .
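One concrete example of the sustainability KPIs this role would define and track is data-center Power Usage Effectiveness (PUE): total facility energy divided by IT equipment energy, where 1.0 is the theoretical ideal. The sketch below is generic and the monthly site readings are invented for illustration.

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness = total facility energy / IT equipment energy.

    Lower is better: the excess over 1.0 is energy spent on cooling,
    power distribution, and other overhead rather than IT work.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Invented monthly readings (kWh) for two hypothetical sites.
sites = {"site_a": (180_000, 120_000), "site_b": (150_000, 125_000)}
for name, (total, it) in sites.items():
    print(f"{name}: PUE = {pue(total, it):.2f}")
# site_a comes out at 1.50, site_b at 1.20 -- site_b carries less overhead
```

Tracking such a ratio per site over time is exactly the kind of measurable, reportable indicator the KPI bullet above asks for.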
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 day ago
8.0 years
0 Lacs
India
Remote
Imaging and Capture Solutions for Document Processing
- 8+ years of relevant experience in ECM imaging and capture products and solutions for document processing
- Strong hands-on experience with OpenText Captiva (now Intelligent Capture)
- Good understanding of imaging and capture concepts: scan, classify, index, extract, etc.
- Experience executing successful implementations of document processing solutions in an enterprise environment
- Strong knowledge of .NET and scripting
- Experience integrating with REST APIs and applications such as ECM repositories
- Experience with other ECM tools like OpenText Documentum (added advantage)
- Experience in Agile delivery; ability to manage a team and adhere to the delivery plan
- Strong interpersonal skills and ability to manage customer and internal stakeholders
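The scan → classify → index → extract flow named above can be sketched as a toy pipeline. This is a generic illustration in Python, not OpenText Captiva's actual API; the keyword rules, regex, and sample document are invented, and a real capture product would use trained classifiers and OCR instead.

```python
import re

def classify(text):
    """Assign a document type from simple keyword rules (a stand-in for
    the trained classifiers a capture product would use)."""
    lowered = text.lower()
    if "invoice" in lowered:
        return "invoice"
    if "purchase order" in lowered:
        return "purchase_order"
    return "unknown"

def extract_fields(text, doc_type):
    """Pull index fields for the classified type via regex."""
    if doc_type == "invoice":
        match = re.search(r"Invoice\s*#\s*(\d+)", text)
        return {"invoice_number": match.group(1) if match else None}
    return {}

def process(text):
    """Classify a captured document, then extract its index fields."""
    doc_type = classify(text)
    return {"type": doc_type, "fields": extract_fields(text, doc_type)}

doc = "Invoice # 4821\nAmount due: 199.00"
print(process(doc))  # {'type': 'invoice', 'fields': {'invoice_number': '4821'}}
```

The structure mirrors the capture concepts in the posting: classification decides the document type, and indexing/extraction populates the fields the downstream ECM repository stores.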
Posted 1 day ago
3.0 years
0 Lacs
Mohali, Punjab
On-site
Chicmic Studios

Job Role: Data Scientist
Experience Required: 3+ Years
Skills Required: Data Science, Python, Pandas, Matplotlib

Job Description:
We are seeking a Data Scientist with strong expertise in data analysis, machine learning, and visualization. The ideal candidate should be proficient in Python, Pandas, and Matplotlib, with experience in building and optimizing data-driven models. Some experience with Natural Language Processing (NLP) and Named Entity Recognition (NER) models would be a plus.

Roles & Duties:
- Analyze and process large datasets using Python and Pandas.
- Develop and optimize machine learning models for predictive analytics.
- Create data visualizations using Matplotlib and Seaborn to support decision-making.
- Perform data cleaning, feature engineering, and statistical analysis.
- Work with structured and unstructured data to extract meaningful insights.
- Implement and fine-tune NER models for specific use cases (if required).
- Collaborate with cross-functional teams to drive data-driven solutions.

Required Skills & Qualifications:
- Strong proficiency in Python and data science libraries (Pandas, NumPy, Scikit-learn, etc.).
- Experience in data analysis, statistical modeling, and machine learning.
- Hands-on expertise in data visualization using Matplotlib and Seaborn.
- Understanding of SQL and database querying.
- Familiarity with NLP techniques and NER models is a plus.
- Strong problem-solving and analytical skills.

Contact: 9875952836
Office Address: F273, Phase 8B Industrial Area, Mohali, Punjab
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: In person
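The data-cleaning and statistical-analysis duties above can be illustrated with a minimal standard-library sketch; the raw records are invented, and a real pipeline at this scale would use Pandas (`dropna`, `astype`, `describe`) rather than hand-rolled loops.

```python
from statistics import mean, median

# Invented raw records: some values are missing or malformed,
# as is typical of data arriving from upstream systems.
raw = [
    {"price": "10.0"},
    {"price": ""},      # missing
    {"price": "12.5"},
    {"price": "n/a"},   # malformed
    {"price": "9.5"},
]

def clean(records, field):
    """Keep only records whose field parses as a float (basic data cleaning)."""
    values = []
    for rec in records:
        try:
            values.append(float(rec[field]))
        except ValueError:
            continue  # drop missing/malformed entries
    return values

prices = clean(raw, "price")
print(len(prices), median(prices))  # 3 usable values, median 10.0
print(mean(prices))
```

The same decision (drop, impute, or flag bad values) is the first step of the feature-engineering work the posting describes; only the tooling changes.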
Posted 1 day ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: S&C Global Network - AI - CDP - Marketing Analytics - Analyst
Management Level: 11 - Analyst
Location: Bengaluru, BDC7C
Must-have skills: Data Analytics
Good-to-have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.

Job Summary:
This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.

Roles & Responsibilities:
Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.

WHAT'S IN IT FOR YOU?
As part of our Analytics practice, you will join a worldwide network of over 20k smart and driven colleagues experienced in leading AI/ML/statistical tools, methods, and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance.

What You Would Do in This Role
A Consultant/Manager for Customer Data Platforms serves as the day-to-day marketing technology point of contact and helps our clients get value out of their investment in a Customer Data Platform (CDP) by developing a strategic roadmap focused on personalized activation. You will be working with a multidisciplinary team of Solution Architects, Data Engineers, Data Scientists, and Digital Marketers.

Key Duties and Responsibilities:
- Be a platform expert in one or more leading CDP solutions, with developer-level expertise in Lytics, Segment, Adobe Experience Platform, Amperity, Tealium, Treasure Data, etc., including custom-built CDPs.
- Deep developer-level expertise in real-time event tracking for web analytics, e.g., Google Tag Manager, Adobe Launch, etc.
- Provide deep domain expertise in our client's business and broad knowledge of digital marketing, together with a marketing strategist.
- Deep expert-level knowledge of GA360/GA4, Adobe Analytics, Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk, etc.
- Assess and audit the current state of a client's marketing technology stack (MarTech), including data infrastructure, ad platforms, and data security policies, together with a solutions architect.
- Conduct stakeholder interviews and gather business requirements.
- Translate business requirements into BRDs, CDP customer analytics use cases, and structured technical solutions.
- Prioritize CDP use cases together with the client.
- Create a strategic CDP roadmap focused on data-driven marketing activation.
- Work with the Solution Architect to strategize, architect, and document a scalable CDP implementation tailored to the client's needs.
- Provide hands-on support and platform training for our clients.
- Apply data processing, data engineering, and data schema/model expertise for CDPs to work on data models, unification logic, etc.
- Work with Business Analysts, Data Architects, Technical Architects, and DBAs to achieve project objectives: delivery dates, quality objectives, etc.
- Apply business intelligence expertise for insights and actionable recommendations.
- Apply project management expertise for sprint planning.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.
- Strong understanding of data governance and compliance (i.e., PII, PHI, GDPR, CCPA).
- Experience with analytics tools like Google Analytics or Adobe Analytics is a plus.
- Experience with A/B testing tools is a plus.
- Must have programming experience in PySpark, Python, and shell scripts.
- RDBMS, T-SQL, and NoSQL experience is a must.
- Manage large volumes of structured and unstructured data; extract and clean data to make it amenable for analysis.
Experience in deploying and operationalizing the code is an added advantage.
Experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous integration tools.
Proficient in Excel, MS Word, PowerPoint, etc.
Technical Skills:
Experience with any CDP platform, e.g., as a Lytics, Segment, or Adobe Experience Platform (Real-Time CDP) developer, or as a custom CDP developer on any cloud.
GA4/GA360 and/or Adobe Analytics.
Google Tag Manager, Adobe Launch, or any tag management tool.
Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk, etc.
Deep cloud experience (GCP, AWS, Azure).
Advanced-level Python, SQL, and shell scripting experience.
Data migration, DevOps, MLOps, and Terraform scripting.
Soft Skills:
Strong problem-solving skills.
Good team player.
Attention to detail.
Good communication skills.
Additional Information:
Opportunity to work on innovative projects.
Career growth and leadership exposure.
About Our Company | Accenture
Experience: 3-5 years
Educational Qualification: Any Degree
Posted 1 day ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
What's the role? We are seeking a highly skilled and motivated Senior GIS Data Analyst/Engineer to join our innovative team in India. This role will leverage advanced expertise in GIS, data science, and programming to extract actionable insights from geospatial data, driving impactful business outcomes through cutting-edge visualization and analytical tools.
Responsibilities
Data Analysis and Management: Conduct advanced spatial data analysis using GIS software (ArcGIS, QGIS) to derive meaningful insights. Manage, manipulate, and analyze large geospatial datasets to produce high-quality maps and actionable reports. Ensure data accuracy and integrity through rigorous quality control measures and regular audits.
Programming and Automation: Develop and implement Python scripts for data processing, analysis, and automation, with proficiency in SQL for querying and managing databases. Apply machine learning and AI techniques, including Generative AI, to enhance data accuracy and predictive capabilities in GIS applications.
Visualization and Reporting: Create compelling visualizations and interactive dashboards using Tableau, Power BI, Matplotlib, and Seaborn to communicate complex spatial insights effectively. Leverage advanced Excel for data manipulation and reporting. Develop and maintain high-quality maps to support stakeholder presentations and decision-making.
Process Efficiency and Development: Design and implement efficient data analysis workflows to optimize processing and analysis tasks. Translate process efficiency concepts into development strategies, achieving significant time and effort savings. Continuously evaluate and enhance workflows to improve performance and scalability.
Tool and Application Development: Develop custom GIS tools and applications to address diverse business requirements. Utilize FME (Feature Manipulation Engine) for advanced data transformation and integration tasks (knowledge preferred).
Evaluate scalable options for data storage, processing, and analytics.
Collaboration and Support: Collaborate with cross-functional teams to translate data insights into strategic business solutions. Provide technical support and training to team members on GIS tools, visualization platforms, and data analysis methodologies. Contribute to team and company objectives aligned with business goals.
Continuous Learning: Stay updated on the latest advancements in GIS, data science, machine learning, Generative AI, and visualization technologies.
Who are you? Bachelor’s or master’s degree in Geography, GIS, Data Science, Computer Science, or a related field. Minimum of 5 years of experience in a GIS or data analysis role. Advanced proficiency in ArcGIS and QGIS software. Strong programming skills in Python and proficiency in SQL. Proven expertise in data visualization and reporting using Tableau, Power BI, Matplotlib, Seaborn, and advanced Excel. Hands-on experience with AWS services for data management and analytics. Familiarity with machine learning, AI, and Generative AI applications in GIS environments. Knowledge of FME (Feature Manipulation Engine) is an advantage. Exceptional analytical, problem-solving, and decision-making skills. Excellent communication, collaboration, and teamwork abilities. Ability to work independently, prioritize tasks, and manage multiple projects in a fast-paced environment.
Job location: Gurgaon
HERE is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, age, gender identity, sexual orientation, marital status, parental status, religion, sex, national origin, disability, veteran status, and other legally protected characteristics.
Who are we? HERE Technologies is a location data and technology platform company. We empower our customers to achieve better outcomes – from helping a city manage its infrastructure or a business optimize its assets to guiding drivers to their destination safely.
At HERE we take it upon ourselves to be the change we wish to see. We create solutions that fuel innovation, provide opportunity and foster inclusion to improve people’s lives. If you are inspired by an open world and driven to create positive change, join us. Learn more about us on our YouTube Channel.
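A small taste of the Python-scripted spatial analysis described under "Programming and Automation" above: the haversine great-circle distance, a staple GIS computation, using only the standard library. The city coordinates are approximate and purely illustrative.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate distance Gurgaon -> Mumbai (roughly 1,100 km)
print(round(haversine_km(28.46, 77.03, 19.08, 72.88), 1))
```

In practice this calculation would sit inside a larger pipeline (e.g., computed per feature in a GeoPandas frame or as a field calculator expression in ArcGIS/QGIS), but the core geometry is the same.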
Posted 1 day ago
0.0 years
0 Lacs
Gurugram, Haryana
Remote
Position: GCP Data Engineer
Company Info: Prama (HQ: Chandler, AZ, USA)
Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries and help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in the USA, Canada, Mexico, Brazil, and India.
Location: Bengaluru | Gurugram | Hybrid
Benefits: 5-day working | Career growth | Flexible working | Potential on-site opportunity
Kindly send your CV or resume to careers@prama.ai
Primary skills: GCP, PySpark, Python, SQL, ETL
Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.
Responsibilities:
· Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
· Implement ETL processes to extract, transform, and load data from various sources into BigQuery.
· Optimize data pipelines for performance, cost-efficiency, and reliability.
· Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
· Design and implement data warehouses and data marts using BigQuery.
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.
Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.
Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).
Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, health insurance, leave encashment, paid sick time, Provident Fund, work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): CTC; Expected CTC; Notice Period (days); Experience in GCP; Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
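The extract-transform-load flow this posting describes can be sketched end to end in a few functions. This is a toy illustration, not the role's actual stack: `sqlite3` stands in for BigQuery as the load target, and the `orders` table, column names, and cleaning rules are invented for the example.

```python
import sqlite3

def extract(source_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return source_rows

def transform(rows):
    """Transform: normalise names/country codes and drop malformed records."""
    out = []
    for name, country, amount in rows:
        if amount is None:
            continue  # malformed record, skipped
        out.append((name.strip().title(), country.upper(), float(amount)))
    return out

def load(rows, conn):
    """Load: write transformed rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (name TEXT, country TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract([(" alice ", "in", "10.5"), ("bob", "us", None)])), conn)
print(conn.execute("SELECT * FROM orders").fetchall())
```

On GCP the same three stages would typically map to a Dataflow or Dataproc job reading from Cloud Storage and writing to BigQuery, with the transform step carrying the pipeline's business logic.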
Posted 1 day ago
9.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Attero is a NASA-recognized metal extraction company and end-to-end recycler of Li-ion batteries and e-waste, headquartered in Noida with a manufacturing facility in Roorkee, Uttarakhand. Attero is amongst a handful of elite organizations globally with the capability to extract pure metals like lithium, cobalt, titanium, nickel, manganese, graphite, gold, copper, palladium, etc., from end-of-life electronics and lithium-ion batteries. The company is now in the process of global expansion and is setting up operations in India, Europe, and North America. Given the pace at which the company wants to grow, it expects employees to go beyond their defined roles to accomplish results, cooperate and collaborate with other team members, and be willing to innovate, apply new ideas, and take calculated risks like an entrepreneur.
Position: Talent Acquisition Manager
Location: Noida
Experience: 9 to 12 years in corporate hiring only
Job Summary: The Talent Acquisition Lead is a strategic role responsible for leading the recruitment and talent acquisition efforts of the organization. This position involves developing and implementing innovative recruitment strategies to attract top talent, managing the entire recruitment lifecycle, and ensuring a positive candidate experience. The Talent Acquisition Lead collaborates closely with hiring managers and department heads to understand staffing needs, establish recruitment goals, and build a strong talent pipeline.
Responsibilities:
Develop and execute comprehensive recruitment strategies to attract high-quality candidates across all levels and functions, conducting thorough candidate screenings, interviews, and assessments to evaluate qualifications, skills, and cultural fit.
Collaborate with hiring managers to understand staffing needs and develop job descriptions and specifications.
Utilize various recruitment channels, including job boards, social media, networking events, and employee referrals, to source potential candidates.
Manage the end-to-end recruitment process, including job posting, sourcing, candidate communication, interview coordination, and offer negotiation.
Build and maintain strong relationships with internal stakeholders, external recruitment agencies, and educational institutions to enhance the talent pipeline.
Analyze recruitment metrics and data to assess the effectiveness of recruitment strategies and make data-driven decisions for continuous improvement.
Stay updated on industry trends, best practices, and emerging technologies in talent acquisition to drive innovation and efficiency.
Qualifications:
Must have a Master’s degree (full-time) in Human Resources, Business Administration, or a related field.
Must have 9 to 12 years of proven experience in talent acquisition, recruitment, or HR, ideally in a leadership role.
Should have experience managing recruitment for both technical and non-technical positions across all levels, from entry-level to the highest seniority.
Must have led a team of 2-4 recruiters.
Experience in New Age, Deep Tech, Infrastructure, Manufacturing (Non-IT), or BFSI organizations, with expertise in managing volume hiring for both permanent and contractual roles, will be considered a significant advantage.
Must possess strong skills in reporting, AI-enabled tools, HR analytics, or data analysis.
Strong understanding of recruitment principles, techniques, and best practices.
Excellent communication, interpersonal, and negotiation skills.
Ability to build relationships and collaborate effectively with stakeholders at all levels.
Demonstrated ability to manage multiple priorities and meet deadlines in a fast-paced environment.
Proficiency in recruitment software, applicant tracking systems, and other HR tools.
Benefits:
Competitive salary and benefits package.
Opportunity to lead and shape the recruitment strategy of a growing organization.
Professional development opportunities and ongoing training.
Dynamic and collaborative work environment.
Opportunities for career advancement and growth within the organization.
Posted 1 day ago
0 years
0 Lacs
India
Remote
Role: Tableau Admin
Duration: Long-term contract with our direct client
Location: Remote
End-to-End Tableau Platform Management: Oversee the health, security, and performance of our entire Tableau ecosystem, including user administration, content management, and strategic platform evolution.
SAP HANA Integration & Optimization: Serve as the subject matter expert for Tableau's integration with SAP HANA. Diagnose and resolve complex connectivity, data extract, and live connection issues. Collaborate with data engineering teams to optimize HANA views and data models for efficient consumption by Tableau.
Advanced Performance Tuning & Optimization: Conduct comprehensive performance audits of Tableau workbooks and dashboards using tools like the Performance Recorder and log analyzers. Implement and enforce best practices for efficient workbook design, including optimizing calculations, reducing marks, and effective use of filters and parameters. Analyze and tune data source performance, including optimizing custom SQL, promoting the use of efficient connection types, and implementing effective data extract strategies. Work with developers and analysts to refactor and improve the performance of mission-critical dashboards, ensuring a fast and reliable user experience.
Comprehensive Tableau Cloud Setup & Administration: Lead the architecture and administration of our Tableau Cloud environment, including strategic site creation, project organization, and content governance. Implement and manage robust user provisioning and security models, including single sign-on (SSO) integration and granular permissions. Monitor Tableau Cloud usage, performance metrics, and backgrounder/extract refresh schedules to ensure platform stability and resource optimization. Manage the Tableau Bridge client for seamless data connectivity between Tableau Cloud and on-premises data sources.
Stay current with the Tableau Cloud release schedule and implement new features to enhance our analytics capabilities.
User Enablement & Support: Develop and maintain documentation on best practices, provide training to business users, and act as the highest level of technical support for the Tableau platform.
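The log-analysis side of the performance-audit duties above might look something like this sketch: scan refresh logs and surface data sources whose average refresh time exceeds a threshold. The log format here is entirely hypothetical; real Tableau backgrounder logs are structured differently, so a production script would parse the actual log schema.

```python
import re
from collections import defaultdict

# Hypothetical log lines; real Tableau backgrounder logs differ in format.
LOG = """\
2025-06-15 01:00:02 refresh datasource=sales_extract duration_s=412
2025-06-15 01:10:44 refresh datasource=hr_extract duration_s=38
2025-06-15 02:00:05 refresh datasource=sales_extract duration_s=455
"""

def slow_refreshes(log_text, threshold_s=300):
    """Return average refresh duration per data source, keeping only those
    whose average exceeds the threshold (candidates for tuning)."""
    durations = defaultdict(list)
    for m in re.finditer(r"datasource=(\S+) duration_s=(\d+)", log_text):
        durations[m.group(1)].append(int(m.group(2)))
    return {name: sum(d) / len(d) for name, d in durations.items()
            if sum(d) / len(d) > threshold_s}

print(slow_refreshes(LOG))
```

Flagged sources would then be candidates for the tuning steps the posting lists: incremental extracts, leaner custom SQL, or restructured HANA views.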
Posted 1 day ago
7.0 years
0 Lacs
Haveli, Maharashtra, India
On-site
Sr. Software Engineer - Azure Power BI
Job Date: Jun 15, 2025
Job Requisition Id: 61602
Location: Pune, MH, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Power BI professionals in the following areas:
Job Description: The candidate should have strong hands-on Power BI experience and a robust background in Azure data modeling and ETL (Extract, Transform, Load) processes. The candidate should have essential hands-on experience with advanced SQL and Python, proficiency in building data lakes and pipelines using Azure, and MS Fabric implementation experience. Additionally, knowledge and experience in the Quality domain, and holding Azure certifications, are considered a plus.
Required Skills:
7+ years of experience in software engineering, with a focus on data engineering.
Proven 5+ years of extensive hands-on experience in Power BI report development.
Proven 3+ years in data analytics, with a strong focus on Azure data services.
Strong experience in data modeling and ETL processes.
Advanced hands-on SQL and Python knowledge and experience working with relational databases for data querying and retrieval.
Drive best practices in data engineering, data modeling, data integration, and data visualization to ensure the reliability, scalability, and performance of data solutions.
Should be able to work independently end to end and guide other team members.
Exposure to Microsoft Fabric is good to have.
Good knowledge of SAP and quality processes.
Excellent business communication skills.
Good data analytical skills to analyze data and understand business requirements.
Excellent knowledge of SQL for performing data analysis and performance tuning.
Ability to test and document end-to-end processes.
Proficient in the MS Office suite (Word, Excel, PowerPoint, Access, Visio).
Proven strong relationship-building and communication skills with team members and business users.
Excellent communication and presentation skills, with the ability to effectively convey technical concepts to non-technical stakeholders.
Partner with business stakeholders to understand their data requirements, challenges, and opportunities, and identify areas where data analytics can drive value.
Desired Skills:
Extensive hands-on experience with Power BI.
Proven 5+ years of experience in data analytics with a strong focus on Azure data services and Power BI.
Exposure to Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
Solid understanding of data visualization and engineering principles, including data modeling, ETL/ELT processes, and data warehousing concepts.
Experience with Microsoft Fabric is good to have.
Strong proficiency in SQL.
SAP HANA modelling experience is nice to have; BusinessObjects and Tableau are nice to have.
Experience working in a captive unit is a plus.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.
Responsibilities:
Responsible for working with Quality and IT teams to design and implement data solutions. This includes responsibility for the method and processes used to translate business needs into functional and technical specifications.
Design, develop, and maintain robust data models, ETL pipelines, and visualizations.
Responsible for building Power BI reports and dashboards.
Responsible for building a new data lake in Azure, expanding and optimizing our data platform and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.
Responsible for designing and developing solutions in Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, Fabric.
Develop and maintain Python scripts for data processing and automation.
Troubleshoot and resolve data-related issues and provide support for escalated technical problems.
Process Improvement: Ensure data quality and integrity across various data sources and systems. Maintain quality of data in the warehouse, ensuring integrity of data in the warehouse and correcting any data problems. Participate in code reviews and contribute to best practices for data engineering. Ensure data security and compliance with relevant regulations and best practices. Develop standards, process flows, and tools that promote and facilitate the mapping of data sources, documenting interfaces and data movement across the enterprise. Ensure design meets the requirements.
Education: IT graduate (BE, BTech, MCA) preferred.
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
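The data-quality responsibilities above (Python scripts for processing and automation, ensuring data quality and integrity across sources) can be sketched with a small stdlib-only check. The feed, column names, and rules here are invented for illustration; a real pipeline would run equivalent checks inside Data Factory or a Fabric notebook.

```python
import csv
import io

# Hypothetical feed; column names are invented for illustration.
RAW = """order_id,region,amount
1001,NA,250.0
1002,,80.0
1001,NA,250.0
1003,EU,
"""

def quality_report(text, key="order_id"):
    """Count duplicate keys and empty cells per column in a CSV feed."""
    rows = list(csv.DictReader(io.StringIO(text)))
    seen, dupes = set(), 0
    empty = {col: 0 for col in rows[0]}
    for row in rows:
        dupes += row[key] in seen  # True counts as 1
        seen.add(row[key])
        for col, val in row.items():
            empty[col] += (val == "")
    return {"rows": len(rows), "duplicate_keys": dupes, "empty_cells": empty}

print(quality_report(RAW))
```

A report like this would typically gate the load step: rows with duplicate keys or empty mandatory cells are quarantined or corrected before they reach the warehouse.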
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Haveli, Maharashtra, India
On-site
Data Modeller
Job Date: Jun 15, 2025
Job Requisition Id: 61619
Location: Pune, IN; Hyderabad, TG, IN; Indore, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Data Modeling professionals in the following areas:
Experience: 8-10 years
Job Description:
8-10 years of data modelling experience in a large enterprise. Dimensional data modeling is a key requirement. Strong communication skills.
Should be good at translating business requirements into conceptual, logical, and physical data models.
Knowledge of star-schema modelling, snowflake-schema modelling, and fact and dimension tables.
Demonstrable experience in modeling using a variety of techniques (3NF, dimensional, data vault, etc.) for different data stores and use cases using MS SQL Server / Snowflake.
Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics).
Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
Design and implement reliable, scalable, robust, and extensible big data systems that support core products and business.
Own all data modeling efforts for an Analytic Theme within our Snowflake Analytic Platform, including the design of data structures and the identification of business transformation logic.
Ensure the consistency, availability, understanding, and performance of data by following and improving best practices and standards for data modeling.
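The star-schema pattern named in the requirements above can be shown concretely: one fact table of measures keyed to dimension tables, queried by joining facts to dimensions and grouping on dimension attributes. This is a minimal illustration using `sqlite3`; the table and column names are invented, and a real model on MS SQL Server or Snowflake would carry many more attributes and conformed dimensions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Star schema: one fact table referencing two dimension tables.
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER REFERENCES dim_date,
                          product_key INTEGER REFERENCES dim_product,
                          units INTEGER, revenue REAL);
INSERT INTO dim_date VALUES (20250601, 2025, 6), (20250701, 2025, 7);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales VALUES (20250601, 1, 10, 100.0),
                              (20250601, 2, 4, 200.0),
                              (20250701, 1, 6, 60.0);
""")
# A typical dimensional query: revenue by month and category.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g., `dim_product` referencing a separate `dim_category` table); the fact table and join pattern stay the same.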
Partner with data SMEs, data governance staff, architects, and product owners to ensure that data meets consumption needs and conforms to governance standard methodologies.
Required Technical/Functional Competencies:
Domain/Industry Knowledge: Working knowledge of the customer's business processes and relevant technology platform or product. Able to analyze current-state and define to-be processes in collaboration with SMEs and present recommendations with tangible benefits.
Requirement Gathering and Analysis: Specialized knowledge of requirement management processes and requirement analysis processes, tools, and methodologies. Extract requirements for complex scenarios and prototype independently. Identify modules and features/functionalities impacted and arrive at high-level estimates. Develop a traceability matrix and identify transition requirements.
Product/Technology Knowledge: Implement code or configure/customize products, drive adoption of industry standards and practices, and contribute to the development of reusable assets and innovative solutions. Conduct technical and knowledge-sharing sessions, and work on complex modules independently. Analyze various frameworks/tools and present recommendations, contribute to the development of training and certification material, and demonstrate thought leadership through whitepapers and webinars.
Architecture Tools and Frameworks: Specialized knowledge of industry architecture tools and frameworks. Implement tools and frameworks in complex scenarios. Conduct tools and framework customization and tailoring workshops.
Architecture Concepts and Principles: Specialized knowledge of architectural elements, SDLC, methodologies, and customer business/domain. Establish architectural principles/patterns and use advanced tools to capture and analyze system/technical issues.
Analytics Solution Design: In-depth knowledge of statistical and machine learning techniques.
Able to design an analytical modelling approach for moderate-scale projects or for components of large-scale ones. Understand business requirements and constraints, including potential trade-offs between speed and accuracy, and stay on top of trends.
Tools & Platform Knowledge: Familiarity with the wide range of commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, areas of application, and the mainstream packages relevant to the technical stages of data science/analytics projects. Intermediate to advanced skills in programming languages used for data science/analytics and the ability to apply these for data acquisition, pre-processing, modelling, and model deployment. Ability to interpret and modify existing scripts and conduct quality checks.
Required Behavioral Competencies:
Accountability: Takes responsibility for, and ensures accuracy of, results and deadlines of the function and/or team and in completing own work.
Collaboration: Reaches out to others in the team to ensure connections are made and team members are working together. Looks for ways to integrate work with other teams, identifying similarities and opportunities and making necessary changes in work to ensure successful integration.
Agility: Demonstrates openness to the possibilities that change presents and begins to plan for how the role may change. Works with others to prepare for change.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Communicates well-organized ideas, information, and data to broad and diverse audiences across the organization (through formal and informal presentations). Helps others identify their appropriate audience.
Drives Results: Sets realistic stretch goals for self and others and perseveres to follow through with resilience, remaining calm in a crisis or stressful situation to exceed organization/client expectations.
Resolves Conflict: Identifies and understands the source of conflict, then addresses and overcomes it.
Certifications: Mandatory
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
Posted 1 day ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About OnlineSales.ai Built by ex-Amazon ad-tech experts, OnlineSales.ai offers a future-proof Retail Media Operating System, boosting Retailers' profitability by 7% of Sales! We are an Enterprise B2B SaaS startup based out of Pune, India. With OnlineSales.ai's platform, retailers activate and delight 10x more Brands by offering an omni-channel media buying experience, advanced targeting, analytics & 2x better ROAS. Tier 1 Retailers and Marketplaces globally are accelerating their Monetization strategy with OnlineSales.ai and are innovating ahead of the market by at least 2 years. About The Role We are seeking a talented and motivated individual to join our team as a Senior Data Analyst, responsible for extracting insights from complex datasets to drive informed decision-making and enhance business performance. You will collaborate closely with cross-functional teams to identify key metrics, develop data-driven strategies, and provide actionable recommendations. Additional responsibilities may include managing daily regulatory reporting tasks and remediation activities, as well as process improvement. What will you do @OnlineSales? Data Analysis: Utilize advanced analytical techniques to explore large datasets, identify trends, patterns, and anomalies, and extract actionable insights. Data Visualization: Create visually compelling dashboards and reports to communicate findings effectively to stakeholders, enabling them to make informed decisions. Data Extraction: Regular extraction of relevant data from internal databases using SQL queries. Design and optimize SQL queries to retrieve specific datasets required for performance analysis and reporting. Issue Identification: Proactively identify performance-related issues by monitoring key performance indicators (KPIs), analyzing trends, and investigating anomalies reported by internal stakeholders or external clients.
Addressing Client Exceptions and Issues: Responsively address performance-related exceptions and issues raised by clients, ensuring timely resolution and effective communication throughout the process. Collaborate with client-facing teams to understand client requirements, prioritize tasks, and deliver solutions that meet or exceed client expectations. Root Cause Analysis: Dive deep into data to understand the root causes of performance issues, considering factors such as system architecture, infrastructure, code efficiency, and user behavior. Hypothesis Testing: Apply hypothesis testing techniques to validate assumptions and identify statistically significant factors impacting performance. Documentation and SOP Creation: Create clear and detailed Standard Operating Procedures (SOPs) outlining the process for diagnosing, troubleshooting, and resolving performance issues. Ensure that documentation is organized, easily accessible, and regularly updated to reflect changes in systems, processes, or configurations. Cross-Functional Collaboration: Collaborate with teams across the organization, including business development, marketing, product development, and operations, to understand their data needs and provide analytical support. You will be a great fit, if you have: 2 to 4 years of relevant experience. Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Proficiency in SQL for data extraction and manipulation from relational databases. Familiarity with programming languages such as Python for data analysis and data modeling is a plus. Strong analytical skills with the ability to interpret complex datasets and draw meaningful insights. Strong problem-solving abilities with a proactive approach to troubleshooting and issue resolution. Advanced proficiency in Excel and adept data manipulation skills for efficient analysis and visualization of large datasets.
Effective communication and interpersonal skills for collaboration with cross-functional teams and stakeholders. Understanding of E-Commerce as a domain. Excellent documentation skills with the ability to create clear and comprehensive reports and SOPs. Attention to detail and commitment to data accuracy and quality. Willingness to work for a startup. (ref:hirist.tech)
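The hypothesis-testing responsibility in the role above (validating whether a factor significantly impacts performance) can be sketched with a stdlib-only two-sample permutation test; the latency figures and the before/after scenario are hypothetical illustrations, not data from the posting.

```python
import random
import statistics

def permutation_test(sample_a, sample_b, n_permutations=2000, seed=0):
    """Two-sample permutation test on the absolute difference of means.

    Returns the fraction of label-shuffled datasets whose mean difference
    is at least as extreme as the observed one (an approximate p-value).
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical page-load latencies (ms) before and after a config change.
before = [100, 102, 98, 101, 99, 103, 97, 100]
after = [110, 112, 108, 111, 109, 113, 107, 110]
p_value = permutation_test(before, after)
```

With a p-value below a chosen significance level (commonly 0.05), the shift would be treated as statistically significant; identical samples yield a p-value of 1.0 by construction.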
Posted 1 day ago
2.0 years
0 Lacs
Tamil Nadu, India
On-site
About You – Experience, Education, Skills, And Accomplishments Years of Service - Fresher or less than 2 years of relevant experience. Education - M.Sc Chemistry / M.Sc Biochemistry / B.Pharm / M.Pharm graduates. Preferred Qualifications - Chemical drawing packages, e.g., ISIS Draw; problem identification and solving skills; good analytical skills. Outstanding communication skills (written and oral) with the ability to communicate clearly, concisely, and objectively in both written and spoken English. What will you be doing in this role? Responsible for indexing/coding chemical compounds in patents. Extract pharmaceutical, therapeutic, and agrochemical activities, chemical reactions, and drug information, and draw Markush structures, etc., from patents. Achieving target volume deliverables with high efficiency and quality. Play an active role in the team and maintain awareness of current trends and new developments in Pharmaceutical/Chemistry areas. In-depth knowledge of at least one structure handling tool. IUPAC nomenclature skills are an added asset. Comprehensive knowledge of chemistry, including reactions, formulae, catalysts, additives, and their functions. Responsible for tasks as requested by the manager on a permanent or temporary basis. Prioritize and complete tasks based on the situation. Maintain a flexible and adaptable approach towards process change. Work collaboratively within and across teams to carry out tasks and be accountable for assigned responsibilities. A trusted resource in achieving customer delight. Summary Scientific Editor - Junior/Entry level in IP Content Editing Team - Delivering value-add scientific information for DWPI™ (Derwent World Patents Index), a proprietary database used to easily search and identify pharmaceutical compounds/compositions/structures covered in patents.
At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.
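As a toy illustration of the structured chemical data the indexing work above produces, a molecular formula can be decomposed into atom counts with the stdlib alone; real DWPI coding uses dedicated structure-handling tools, and this simplified parser (no brackets, charges, or isotopes) is purely hypothetical.

```python
import re
from collections import Counter

def parse_formula(formula):
    """Count atoms in a simple molecular formula (no brackets or isotopes).

    Matches an element symbol (one uppercase letter, optional lowercase
    letter) followed by an optional count, e.g. 'C6H12O6' or 'NaCl'.
    """
    counts = Counter()
    for symbol, number in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[symbol] += int(number) if number else 1
    return dict(counts)

glucose = parse_formula("C6H12O6")
```

A symbol with no trailing digits counts as one atom, so parse_formula("NaCl") gives one Na and one Cl.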
Posted 1 day ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: WebBoost Solutions by UM Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with a Certificate of Internship About WebBoost Solutions by UM WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career. Responsibilities ✅ Collect, preprocess, and analyze large datasets. ✅ Develop predictive models and machine learning algorithms. ✅ Perform exploratory data analysis (EDA) to extract meaningful insights. ✅ Create data visualizations and dashboards for effective communication of findings. ✅ Collaborate with cross-functional teams to deliver data-driven solutions. Requirements 🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field. 🐍 Proficiency in Python or R for data analysis and modeling. 🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred). 📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib). 🧐 Strong analytical and problem-solving skills. 🗣 Excellent communication and teamwork abilities. Stipend & Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based). ✔ Hands-on experience in data science projects. ✔ Certificate of Internship & Letter of Recommendation. ✔ Opportunity to build a strong portfolio of data science models and applications. ✔ Potential for full-time employment based on performance. How to Apply 📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application." 📅 Deadline: 17th June 2025 Equal Opportunity WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
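The exploratory data analysis responsibility above can be illustrated with a stdlib-only summary routine; the column name and order records below are hypothetical sample data, not anything from the posting.

```python
import statistics

def summarize(rows, column):
    """Return basic descriptive statistics for one numeric column,
    skipping records where the value is missing."""
    values = [row[column] for row in rows if row.get(column) is not None]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
    }

# Hypothetical dataset with one missing value to show basic cleaning.
orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.0},
    {"order_id": 3, "amount": None},   # missing amount is skipped
    {"order_id": 4, "amount": 100.0},
]
stats = summarize(orders, "amount")
```

In practice an EDA pass would run this kind of profile over every column before modeling, typically with pandas rather than hand-rolled loops.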
Posted 1 day ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid) Company: Unified Mentor Location: Remote Duration: 3 months Application Deadline: 17th June 2025 Opportunity: Full-time role based on performance + Internship Certificate About Unified Mentor Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers. Responsibilities Collect, preprocess, and analyze large datasets Develop predictive models and machine learning algorithms Perform exploratory data analysis (EDA) to extract insights Create data visualizations and dashboards for effective communication Collaborate with cross-functional teams to deliver data-driven solutions Requirements Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field Proficiency in Python or R for data analysis and modeling Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred) Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib Strong analytical and problem-solving skills Excellent communication and teamwork abilities Stipend & Benefits Stipend: ₹7,500 - ₹15,000 (Performance-Based) Hands-on experience in data science projects Certificate of Internship & Letter of Recommendation Opportunity to build a strong portfolio of data science models and applications Potential for full-time employment based on performance How to Apply Submit your resume and a cover letter with the subject line "Data Science Intern Application." Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 1 day ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Power BI Professionals in the following areas: Job Description The candidate should have strong hands-on Power BI experience and a robust background in Azure data modeling and ETL (Extract, Transform, Load) processes. Essential hands-on experience with advanced SQL and Python is required, along with proficiency in building Data Lakes and pipelines using Azure, and MS Fabric implementation experience. Additionally, knowledge and experience in the Quality domain, and holding Azure certifications, are considered a plus. Required Skills 7+ years of experience in software engineering, with a focus on data engineering. Proven 5+ years of extensive hands-on experience in Power BI report development. Proven 3+ years in data analytics, with a strong focus on Azure data services. Strong experience in data modeling and ETL processes. Advanced hands-on SQL and Python knowledge and experience working with relational databases for data querying and retrieval. Drive best practices in data engineering, data modeling, data integration, and data visualization to ensure the reliability, scalability, and performance of data solutions. Should be able to work independently end to end and guide other team members. Exposure to Microsoft Fabric is good to have. Good knowledge of SAP and quality processes. Excellent business communication skills. Good data analytical skills to analyze data and understand business requirements.
Excellent knowledge of SQL for performing data analysis and performance tuning. Ability to test and document end-to-end processes. Proficient in the MS Office suite (Word, Excel, PowerPoint, Access, Visio). Proven strong relationship-building and communication skills with team members and business users. Excellent communication and presentation skills, with the ability to effectively convey technical concepts to non-technical stakeholders. Partner with business stakeholders to understand their data requirements, challenges, and opportunities, and identify areas where data analytics can drive value. Desired Skills Extensive hands-on experience with Power BI. Proven 5+ years of experience in data analytics with a strong focus on Azure data services and Power BI. Exposure to Azure Data Factory, Azure Synapse Analytics, Azure Databricks. Solid understanding of data visualization and engineering principles, including data modeling, ETL/ELT processes, and data warehousing concepts. Experience with Microsoft Fabric is good to have. Strong proficiency in SQL. HANA modelling experience is nice to have. BusinessObjects and Tableau are nice to have. Experience of working in a captive unit is a plus. Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders. Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic environment. Responsibilities Responsible for working with Quality and IT teams to design and implement data solutions. This includes responsibility for the method and processes used to translate business needs into functional and technical specifications. Design, develop, and maintain robust data models, ETL pipelines, and visualizations. Responsible for building Power BI reports and dashboards.
Responsible for building a new Data Lake in Azure, expanding and optimizing our data platform and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. Responsible for designing and developing solutions in Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, Fabric. Develop and maintain Python scripts for data processing and automation. Troubleshoot and resolve data-related issues and provide support for escalated technical problems. Process Improvement: Ensure data quality and integrity across various data sources and systems. Maintain the quality and integrity of data in the warehouse, correcting any data problems. Participate in code reviews and contribute to best practices for data engineering. Ensure data security and compliance with relevant regulations and best practices. Develop standards, process flows, and tools that promote and facilitate the mapping of data sources, documenting interfaces and data movement across the enterprise. Ensure the design meets the requirements. Education: IT Graduate (BE, BTech, MCA) preferred. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: Flexible work arrangements, free spirit, and emotional positivity; Agile self-determination, trust, transparency, and open collaboration; All support needed for the realization of business goals; Stable employment with a great atmosphere and ethical corporate culture.
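The extract/transform/load separation the pipeline work above refers to can be sketched as a toy in-memory pipeline; the field names, validation rule, and source rows are hypothetical, and a real implementation would target the posting's Azure stack rather than Python lists.

```python
def extract(source_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize fields and drop records that fail validation."""
    cleaned = []
    for row in rows:
        if row.get("quantity") is None or row["quantity"] < 0:
            continue  # reject invalid rows instead of loading them
        cleaned.append({
            "sku": row["sku"].strip().upper(),
            "quantity": int(row["quantity"]),
        })
    return cleaned

def load(rows, warehouse):
    """Load: append validated rows to the target store."""
    warehouse.extend(rows)
    return len(rows)

# Hypothetical source data; one row is invalid and should be filtered out.
source = [
    {"sku": " ab-1 ", "quantity": 3},
    {"sku": "cd-2", "quantity": -5},
    {"sku": "ef-3", "quantity": 7},
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

Keeping the three stages as separate functions is what makes a pipeline testable and lets each stage be swapped (e.g. a database reader for extract) without touching the others.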
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Data Modeling Professionals in the following areas: Experience: 8-10 Years Job Description: 8-10 years of data modelling experience in a large enterprise. Dimensional data modeling is a key requirement. Strong communication skills. Should be good at translating business requirements into conceptual, logical, and physical data models. Knowledge of Star-Schema Modelling, Snowflake Schema Modelling, and Fact and Dimension tables. Demonstrable experience in modeling using a variety of techniques (3NF, dimensional, data vault, etc.) for different data stores and use cases using MS SQL Server / Snowflake. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
Design and implement reliable, scalable, robust, and extensible big data systems that support core products and business. Own all data modeling efforts for an Analytic Theme within our Snowflake Analytic Platform, including the design of data structures and the identification of business transformation logic. Ensure the consistency, availability, understanding, and performance of data by following and improving best practices and standards for data modeling. Partner with data SMEs, data governance staff, architects, and product owners to ensure that data meets consumption needs and conforms to governance standards and methodologies. Required Technical/Functional Competencies Domain/Industry Knowledge: Working knowledge of the customer's business processes and relevant technology platform or product. Able to analyze current-state and define to-be processes in collaboration with SMEs and present recommendations with tangible benefits. Requirement Gathering And Analysis: Specialized knowledge of requirement management processes and requirement analysis processes, tools & methodologies. Extract requirements for complex scenarios and prototype independently. Identify modules impacted, features/functionalities impacted, and arrive at high-level estimates. Develop a traceability matrix and identify transition requirements. Product/Technology Knowledge: Implement code or configure/customize products, drive adoption of industry standards and practices, and contribute to the development of reusable assets and innovative solutions. Conduct technical sessions and knowledge-sharing sessions, and work on complex modules independently. Analyze various frameworks/tools and present recommendations, contribute to the development of training and certification material, and demonstrate thought leadership through whitepapers and webinars. Architecture Tools And Frameworks: Specialized knowledge of architecture industry tools & frameworks. Implement tools & frameworks in a complex scenario.
Conduct tools & framework customization & tailoring workshops. Architecture Concepts And Principles: Specialized knowledge of architectural elements, SDLC, methodologies & customer business/domain. Establish architectural principles/patterns and use advanced tools to capture and analyze system/technical issues. Analytics Solution Design: In-depth knowledge of statistical & machine learning techniques. Able to design an analytical modelling approach for moderate-scale projects or for components of large-scale ones. Understands business requirements & constraints, including potential trade-offs between speed & accuracy, and stays current with trends. Tools & Platform Knowledge: Familiarity with a wide range of commercial and open-source data science/analytics tools, their constraints, advantages, disadvantages, areas of application, and the mainstream packages relevant to the technical stages of data science/analytics projects. Intermediate to advanced skills in programming languages used for data science/analytics and the ability to apply these for data acquisition, pre-processing, modelling, and model deployment. Ability to interpret and modify existing scripts and conduct quality checks. Required Behavioral Competencies Accountability: Takes responsibility for and ensures the accuracy of results and deadlines of the function and/or team and in completing own work. Collaboration: Reaches out to others in the team to ensure connections are made and team members are working together. Looks for ways to integrate work with other teams, identifying similarities and opportunities and making necessary changes in work to ensure successful integration. Agility: Demonstrates openness to the possibilities that change presents and begins to plan for how the role may change. Works with others to prepare for change. Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Communicates well-organized ideas, information, and data to broad and diverse audiences across the organization (through formal and informal presentations). Helps others identify their appropriate audience. Drives Results: Sets realistic stretch goals for self & others and perseveres to follow through with resilience, remaining calm in a crisis or stressful situation to exceed organization/client expectations. Resolves Conflict: Identifies and understands the source of conflict, addresses it, and overcomes it. Certifications: Mandatory. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: Flexible work arrangements, free spirit, and emotional positivity; Agile self-determination, trust, transparency, and open collaboration; All support needed for the realization of business goals; Stable employment with a great atmosphere and ethical corporate culture.
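The star-schema modelling this role calls for (a fact table of measures keyed to dimension tables) can be sketched with the stdlib sqlite3 module; the table layouts, names, and rows below are hypothetical examples, not a Snowflake or MS SQL Server design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables describe the "who/what/when"; the fact table holds measures.
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.execute("INSERT INTO dim_date VALUES (1, '2024-01-01')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 10.0), (1, 1, 15.0), (2, 1, 7.5)])

# A typical star join: aggregate the fact table grouped by a dimension.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

The design point is that every analytical query follows the same shape: join the fact table to one or more dimensions, filter on dimension attributes, and aggregate the measures.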
Posted 1 day ago
40.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Job Description As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Excellent understanding, installation, administration, and working knowledge of Oracle 12c through 19c databases. Experience in configuring and managing Oracle RAC, Clusterware, Grid Infrastructure, Oracle ASM, OEM, and Shareplex, STRIIM/GoldenGate. Hands-on with Oracle Database migration/upgrade using different methodologies: Data Pump & RMAN Transportable Tablespaces. Hands-on database performance troubleshooting skills; DB performance tuning skills are a must. Experience with Perl, Unix Shell Scripting, and Python for custom monitoring and automation. Experience with RMAN database backup and recovery. Hands-on with CDB & PDB architecture, design, and implementation. Experience with migration of Oracle Databases from on-premises to cloud. Experience in defining security standards and supporting security audits. Experience with Data Guard setup and management. Experience with Oracle Enterprise Manager (OEM): OEM target discovery, agent installation, configuring metrics and alerts. Good to have: exposure to security skills like TDE, AVDF, data masking, etc.
Installing GoldenGate software and management processes; these are done in conjunction with the DBA team, as the duties are intentionally segregated. Upgrading and/or patching GoldenGate versions (i.e., regular life-cycle management). Assisting in troubleshooting GoldenGate replication issues. Configuring replications (Extract, Pump, and Replicat processes), as requested by Solutions projects, including scoping, design, and implementation. Participating in the scope, design, and implementation of Solutions projects. Monitoring replications, performance, and overall system health. Tuning performance to ensure synchronization rates meet business requirements. Applying changes to replications, such as table or column adds. Refreshing data in target systems, along with the Database team, when databases are refreshed or recovered. Troubleshooting the recovery of failed replications/synchronizations in order to catch up on the backlog of data in a reasonable amount of time. Strong troubleshooting abilities are an absolute must and may require working with the Server and Storage teams in addition to database, Information Management, etc. Career Level - IC3 Responsibilities As a Senior Systems Engineer, you will interface with the customer's IT staff on a regular basis. Either at the client's site or from a remote location, you will be responsible for the resolution of moderately complex technical problems related to the installation, recommended maintenance, use, and repair/workarounds for Oracle products. You should be highly experienced in some Oracle products and several platforms that are being supported. You will be expected to work with only general guidance from management while advising management on progress/status.
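The "Python for custom monitoring and automation" requirement listed above might look, in stdlib-only outline, like a threshold check over tablespace usage figures. The names, numbers, and dictionary input are hypothetical; in a real deployment these figures would come from querying DBA_* views or OEM metrics, not a hard-coded dict.

```python
def usage_alerts(tablespaces, warn_pct=85.0):
    """Flag tablespaces whose used space exceeds a warning threshold.

    `tablespaces` maps a tablespace name to a (used_mb, total_mb) pair.
    Returns human-readable alert strings, sorted by tablespace name.
    """
    alerts = []
    for name, (used_mb, total_mb) in sorted(tablespaces.items()):
        pct = 100.0 * used_mb / total_mb
        if pct >= warn_pct:
            alerts.append(f"{name}: {pct:.1f}% used")
    return alerts

# Hypothetical snapshot of two tablespaces.
sample = {
    "USERS": (900, 1000),    # 90% used, above the threshold
    "SYSTEM": (400, 1000),   # 40% used, fine
}
alerts = usage_alerts(sample)
```

A cron job or OEM metric extension would typically run such a check on a schedule and page the on-call DBA when the alert list is non-empty.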
About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 day ago
40.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Job Description

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

- Excellent understanding of, and working knowledge of installing and administering, Oracle 12c through 19c databases
- Experience configuring and managing Oracle RAC, Clusterware, Grid Infrastructure, Oracle ASM, OEM, and SharePlex, Striim, or GoldenGate
- Hands-on experience with Oracle Database migration/upgrade using different methodologies: Data Pump and RMAN transportable tablespaces
- Hands-on database performance troubleshooting; performance tuning skills are a must
- Experience with Perl, Unix shell scripting, and Python for custom monitoring and automation
- Experience with RMAN database backup and recovery
- Hands-on experience with CDB and PDB architecture, design, and implementation
- Experience migrating Oracle databases from on-premises to the cloud
- Experience defining security standards and supporting security audits
- Experience with Data Guard setup and management
- Experience with Oracle Enterprise Manager (OEM): target discovery, agent installation, and configuring metrics and alerts
- Good to have: exposure to security features such as TDE, AVDF, and data masking
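The "custom monitoring and automation" requirement above is the kind of task a short script covers. The following is a minimal, hypothetical sketch, not a prescribed implementation: the function, threshold, and sample rows are invented for illustration, and in a real script the rows would come from a query against DBA_DATA_FILES / DBA_FREE_SPACE through a driver such as python-oracledb.

```python
# Hypothetical sketch of a custom tablespace-usage monitor. In
# production the rows would come from a live query; here they are
# hardcoded sample data so the alerting logic is self-contained.

def check_tablespace_usage(rows, threshold_pct=85.0):
    """Return alert messages for tablespaces above threshold_pct full.

    rows: iterable of (tablespace_name, used_mb, max_mb) tuples.
    """
    alerts = []
    for name, used_mb, max_mb in rows:
        pct = 100.0 * used_mb / max_mb
        if pct >= threshold_pct:
            alerts.append(
                f"ALERT: {name} is {pct:.1f}% full "
                f"({used_mb:.0f} MB of {max_mb:.0f} MB)"
            )
    return alerts

# Sample data standing in for a live query result.
sample = [
    ("SYSTEM",  880.0, 1024.0),   # ~85.9% full -> alert
    ("USERS",   300.0, 4096.0),   # ~7.3% full  -> fine
    ("SYSAUX", 4000.0, 4096.0),   # ~97.7% full -> alert
]

for line in check_tablespace_usage(sample):
    print(line)
```

A script like this would typically be scheduled via cron and wired to email or an OEM-style notification channel; only the threshold check itself is shown here.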
GoldenGate duties:

- Installing GoldenGate software and management processes; this is done in conjunction with the DBA team, as the duties are intentionally segregated
- Upgrading and/or patching GoldenGate versions (i.e., regular life-cycle management)
- Assisting in troubleshooting GoldenGate replication issues
- Configuring replications (Extract, Pump, and Replicat processes) as requested by Solutions projects, including scoping, design, and implementation
- Participating in the scoping, design, and implementation of Solutions projects
- Monitoring replications, performance, and overall system health
- Tuning performance to ensure synchronization rates meet business requirements
- Applying changes to replications, such as table or column additions
- Refreshing data in target systems, along with the Database team, when databases are refreshed or recovered
- Troubleshooting the recovery of failed replications/synchronizations so the backlog of data is caught up in a reasonable amount of time; strong troubleshooting abilities are a must and may require working with the Server and Storage teams in addition to Database, Information Management, and other groups

Career Level - IC3

Responsibilities

As a Senior Systems Engineer, you will interface with the customer's IT staff on a regular basis. Either at the client's site or from a remote location, you will be responsible for resolving moderately complex technical problems related to the installation, recommended maintenance, use, and repair/workarounds for Oracle products. You should be highly experienced in some Oracle products and the several platforms being supported. You will be expected to work with only general guidance from management while keeping management advised of progress and status.
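The GoldenGate monitoring duty described above is often automated by parsing the output of the GGSCI `INFO ALL` command. A minimal sketch under stated assumptions: the column layout and sample text below are invented to mimic that output, and a real script would capture the live text from ggsci and might need to adjust the parsing.

```python
# Hypothetical sketch of replication monitoring: scan GGSCI
# "INFO ALL"-style output and flag GoldenGate processes that have
# abended or whose checkpoint lag exceeds a threshold. The column
# layout is an assumption for illustration.

def parse_hms(s):
    """Convert an HH:MM:SS lag string to seconds."""
    h, m, sec = (int(p) for p in s.split(":"))
    return h * 3600 + m * 60 + sec

def flag_processes(info_all_text, max_lag_seconds=600):
    """Return (group, reason) pairs for processes needing attention."""
    problems = []
    for line in info_all_text.strip().splitlines():
        parts = line.split()
        # Skip header lines and the MANAGER line (no group/lag columns).
        if len(parts) < 5 or parts[0] not in ("EXTRACT", "REPLICAT"):
            continue
        program, status, group, lag, _since = parts[:5]
        if status != "RUNNING":
            problems.append((group, f"status is {status}"))
        elif parse_hms(lag) > max_lag_seconds:
            problems.append((group, f"lag {lag} exceeds threshold"))
    return problems

# Sample text standing in for captured ggsci output.
sample_output = """
Program     Status      Group       Lag at Chkpt  Time Since Chkpt
MANAGER     RUNNING
EXTRACT     RUNNING     EXTORD      00:00:05      00:00:02
EXTRACT     RUNNING     PMPORD      00:15:40      00:00:03
REPLICAT    ABENDED     REPORD      02:13:40      00:00:07
"""

for group, reason in flag_processes(sample_output):
    print(f"{group}: {reason}")
```

The group names (EXTORD, PMPORD, REPORD) are hypothetical; the point is the check itself, which a cron job could run on each GoldenGate host and route to an alerting channel.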
About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or an accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.