0 years
0 Lacs
Gurugram, Haryana, India
Remote
Role: Spotfire Consultant
Location: Remote
Duration: Full Time

Key Responsibilities:
• Develop, automate, and optimize Spotfire dashboards for data visualization and analysis.
• Connect and integrate SQL databases with Spotfire, including writing and optimizing queries.
• Optimize data mapping for efficient queries and seamless Spotfire integration.
• Work with large datasets to ensure efficient performance in Spotfire.
• Customize dashboards using IronPython scripting and HTML/CSS/JavaScript (see the sketch following this posting).
• Collaborate with internal teams to translate business requirements into actionable insights.
• Troubleshoot performance issues and recommend best practices for data visualization.

Required Skills & Experience:
• Strong experience with Spotfire Analyst for data visualization and analytics.
• Proficiency in SQL (writing queries, stored procedures, and performance tuning).
• Familiarity with database management systems (e.g., Snowflake, SQL Server, Oracle, PostgreSQL, MySQL).
• Experience with HTML, JavaScript, and IronPython for dashboard customization.
• Ability to work independently and communicate effectively with stakeholders.
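For illustration only (not part of the posting above): a minimal sketch of the kind of IronPython dashboard customization the role describes. It runs inside Spotfire's script editor, where the `Document` object is supplied by the platform; the "ReportRegion" document property is a hypothetical name.

```python
# Minimal Spotfire IronPython sketch -- runs only inside Spotfire's script
# context, where `Document` is injected by the platform.
# "ReportRegion" is a hypothetical document property name; adjust to your analysis.
region = Document.Properties["ReportRegion"]

for page in Document.Pages:          # every page in the analysis
    for visual in page.Visuals:      # every visualization on that page
        visual.Title = "{0} ({1})".format(visual.Title, region)
```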
Posted 3 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
Remote
K&K Talents is an international recruiting agency that has been providing technical resources globally since 1993. This position is with one of our clients in India, who is actively hiring candidates to expand their teams.

Title: Technical Analyst - BI Developer/Report Writer
Location: Gurugram, India (remote for the first six months, onsite thereafter)
Employment Type: Full-time Permanent/C2H
Must-Have Skills: Power BI, Dundas BI, Tableau, Cognos

Required Skills:
7+ years of experience as a Report Writer, BI Developer, or SQL Developer.
Advanced proficiency in SQL (MySQL, PostgreSQL, or similar RDBMS).
Experience developing and maintaining reports using BI tools such as Dundas BI, Power BI, Tableau, or Cognos.
Strong knowledge of data modeling techniques and relational database design.
Familiarity with ETL processes, data warehousing concepts, and performance tuning.
Exposure to cloud platforms (Azure, AWS) is a plus.
Experience working in Agile/Scrum environments.
Strong analytical and problem-solving skills.
Excellent communication skills and ability to work in a team environment.
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
Posted 3 days ago
3.0 years
0 Lacs
Mohali, Punjab
On-site
Company: Chicmic Studios Job Role: Python Machine Learning & AI Developer Experience Required: 3+ Years We are looking for a highly skilled and experienced Python Developer to join our dynamic team. The ideal candidate will have a robust background in developing web applications using Django and Flask, with expertise in deploying and managing applications on AWS. Proficiency in Django Rest Framework (DRF), a solid understanding of machine learning concepts, and hands-on experience with tools like PyTorch, TensorFlow, and transformer architectures are essential. Key Responsibilities Develop and maintain web applications using Django and Flask frameworks. Design and implement RESTful APIs using Django Rest Framework (DRF) Deploy, manage, and optimize applications on AWS services, including EC2, S3, RDS, Lambda, and CloudFormation. Build and integrate APIs for AI/ML models into existing systems. Create scalable machine learning models using frameworks like PyTorch , TensorFlow , and scikit-learn . Implement transformer architectures (e.g., BERT, GPT) for NLP and other advanced AI use cases. Optimize machine learning models through advanced techniques such as hyperparameter tuning, pruning, and quantization. Deploy and manage machine learning models in production environments using tools like TensorFlow Serving , TorchServe , and AWS SageMaker . Ensure the scalability, performance, and reliability of applications and deployed models. Collaborate with cross-functional teams to analyze requirements and deliver effective technical solutions. Write clean, maintainable, and efficient code following best practices. Conduct code reviews and provide constructive feedback to peers. Stay up-to-date with the latest industry trends and technologies, particularly in AI/ML. Required Skills and Qualifications Bachelor’s degree in Computer Science, Engineering, or a related field. 3+ years of professional experience as a Python Developer. Proficient in Python with a strong understanding of its ecosystem. Extensive experience with Django and Flask frameworks. Hands-on experience with AWS services for application deployment and management. Strong knowledge of Django Rest Framework (DRF) for building APIs. Expertise in machine learning frameworks such as PyTorch , TensorFlow , and scikit-learn . Experience with transformer architectures for NLP and advanced AI solutions. Solid understanding of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with MLOps practices for managing the machine learning lifecycle. Basic knowledge of front-end technologies (e.g., JavaScript, HTML, CSS) is a plus. Excellent problem-solving skills and the ability to work independently and as part of a team. Strong communication skills and the ability to articulate complex technical concepts to non-technical stakeholders. Contact : 9875952836 Office Location: F273, Phase 8b Industrial Area Mohali, Punjab. Job Type: Full-time Schedule: Day shift Monday to Friday Work Location: In person
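As a hedged illustration of the Django REST Framework work this posting describes (not taken from the employer's codebase; the `Prediction` model and its fields are hypothetical), a minimal serializer/viewset pair might look like this:

```python
# Minimal Django REST Framework sketch: expose a hypothetical Prediction model
# as a CRUD API. Model and field names are illustrative only.
from rest_framework import routers, serializers, viewsets

from .models import Prediction  # hypothetical Django model


class PredictionSerializer(serializers.ModelSerializer):
    class Meta:
        model = Prediction
        fields = ["id", "input_text", "label", "score", "created_at"]


class PredictionViewSet(viewsets.ModelViewSet):
    """CRUD endpoints for stored model predictions."""
    queryset = Prediction.objects.all().order_by("-created_at")
    serializer_class = PredictionSerializer


router = routers.DefaultRouter()
router.register(r"predictions", PredictionViewSet)  # include router.urls in urls.py
```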
Posted 3 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
TEKsystems is seeking a Senior AWS + Data Engineer to join our dynamic team. The ideal candidate should have data engineering expertise with Hadoop, Scala/Python, and AWS services. This role involves designing, developing, and maintaining scalable and reliable software solutions.

Job Title: Data Engineer – Spark/Scala (Batch Processing)
Location: Manyata (Hybrid)
Experience: 7+ years
Type: Full-Time

Mandatory Skills:
7-10 years' experience in design, architecture, or development in Analytics and Data Warehousing.
Experience in building end-to-end solutions on a big data platform with Spark or Scala programming.
5 years of solid experience in ETL pipeline building with a Spark or Scala programming framework, with knowledge of developing UNIX shell scripts and Oracle SQL/PL-SQL.
Experience in big data platform ETL development on the AWS cloud platform (see the PySpark sketch after this posting).
Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, CloudWatch.
Excellent skills in Python-based framework development are mandatory.
Should have experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis.
Extensive experience with Teradata data warehouses and Cloudera Hadoop.
Proficient across enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive.
Analytics & BI architecture appreciation and broad experience across all technology disciplines.
Experience working within a Data Delivery Life Cycle framework and Agile methodology.
Extensive experience in large enterprise environments handling large volumes of data with high SLAs.
Good knowledge of developing UNIX scripts, Oracle SQL/PL-SQL, and Autosys JIL scripts.
Well versed in AI-powered engineering tools like Cline and GitHub Copilot.

Please send resumes to nvaseemuddin@teksystems.com or kebhat@teksystems.com
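For context, a minimal PySpark batch sketch of the kind of ETL this listing refers to; the bucket, paths, and column names are placeholders, not details from the posting.

```python
# Minimal PySpark batch ETL sketch: read raw parquet, aggregate, write curated
# output partitioned by date. All paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-batch").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")   # placeholder path

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_orders/"                  # placeholder path
)
spark.stop()
```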
Posted 3 days ago
0.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Noida, Uttar Pradesh, India | Job ID 764325

Join our Team

About this opportunity:
Join our team as a Storage Engineer, where you will be responsible for managing and optimizing storage solutions in both NetApp and HPE environments. Your role will encompass critical event monitoring, network interface control, and maintaining data integrity through effective storage management. This role is crucial in ensuring our storage infrastructure remains robust, efficient, and secure.

What you will do:
Monitor storage critical events and network interfaces.
Check aggregate and volume usage using AIQUM.
Configure and troubleshoot SnapMirror and SnapVault environments.
Perform daily health checks for LUN, aggregate, and volume utilization.
Manage SnapRestore and CIFS shares, and handle permissions for shared folders.
Set up and troubleshoot vFiler, NFS, CIFS, and SAN (FC and iSCSI) environments.
Conduct ONTAP OS upgrades and manage LUN assignments.
Resolve hardware issues in coordination with NetApp support.
Perform volume resizing, aggregate disk addition, and LUN resizing.
Manage LUNs (LUN creation, LUN snapshots, manual and automatic igroup management, LUN restore) using SnapDrive.
Perform DR tests for the environment.
Manage snapshots, LUN clones, and FlexClones.
Configure Windows iSCSI boot LUNs for the environment.
Manage reports using Operations Manager.
Work with qtrees for efficient storage utilization.
Perform deduplication for volumes.
Configure aliases and zoning for servers connected in the FC environment on Brocade and Cisco switches.
Create and expand aggregates, volumes, and qtrees for NAS environments and LUNs and igroups for SAN environments, and reclaim LUNs and SAN ports where applicable (decommissioning projects).
Provide storage infrastructure system management, including capacity planning, performance monitoring and tuning, and security management.
Manage Tier-3 support following ITIL practice and incident management.
Bring proven experience with complex, enterprise-level NAS platforms in a mission-critical environment.
Lead technical assessments of hardware, software, tools, applications, firmware, middleware, and operating systems to support business operations.
Apply strong product knowledge and troubleshooting skills on 3PAR/Primera, EVA, MSA, and nearline/StoreOnce products.
Handle storage remediation tasks like HBA driver and firmware upgrades.
Engage in capacity planning, performance monitoring, and tuning.
Lead technical assessments and provide infrastructure support, including design, planning, and project deployment.

The skills you bring:
Minimum 2–6 years of experience in storage engineering, with a strong focus on NetApp, HP Primera, and HPE storage systems.
Willingness to work in a 24x7 operational environment with rotating shifts, including weekends and holidays, to support critical infrastructure and ensure minimal downtime.
Proficiency with NDMP backups and integration with third-party products.
Experience in performing disaster recovery tests and storage remediation, including HBA driver and firmware upgrades.
Knowledge of HPE 3PAR/Primera, EVA/MSA, and nearline/StoreOnce products.
Understanding of operating systems, virtualization, and networking.

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
Posted 3 days ago
0.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Job Information
Date Opened: 06/16/2025
Job Type: Full time
Industry: Education
Work Experience: Fresher
Salary: ₹20,000 - ₹30,000
City: Ahmedabad
State/Province: Gujarat
Country: India
Zip/Postal Code: 380060

About Us
Fireblaze AI School is a part of Fireblaze Technologies, which was started in April 2018 with a vision to upskill and train people in emerging technologies.
Mission Statement: "To Provide Measurable & Transformational Value To Learners' Careers"
Vision Statement: "To Be The Most Successful & Respected Job-Oriented Training Provider Globally."
We focus widely on creating a huge digital impact; hence a strong presence on digital platforms is a must-have for us.

Job Description
Job Title: Data Analyst
Location: Ahmedabad
Employment Type: Full-Time (Night Shift)
Salary: As per the interview

Job Summary:
We are looking for a detail-oriented and analytical Data Analyst who is proficient in Python, Machine Learning, SQL, Power BI, and Advanced Excel. The ideal candidate will be responsible for data wrangling, statistical modeling, visualization, and reporting to support data-driven decision-making.

Key Responsibilities:
Collect, process, and clean large datasets from multiple sources.
Perform exploratory data analysis (EDA) to identify trends and patterns.
Develop and deploy Machine Learning models for predictive and classification tasks (see the sketch after this posting).
Write optimized SQL queries to extract and manipulate data.
Create interactive dashboards and reports using Power BI.
Utilize Advanced Excel functions (Pivot Tables, VLOOKUP, Macros, etc.) for data analysis and reporting.
Present findings to stakeholders in a clear and concise manner.
Collaborate with cross-functional teams to understand business requirements and translate them into data solutions.

Requirements
Required Skills & Qualifications:
Proficiency in Python with libraries such as Pandas, NumPy, Matplotlib, and Scikit-learn.
Strong knowledge of Machine Learning algorithms (Regression, Classification, Clustering, etc.).
Experience with SQL for data extraction, joins, and performance tuning.
Hands-on experience in building dashboards and reports using Power BI.
Advanced knowledge of MS Excel, including formulas, data modeling, and automation.
Good problem-solving and communication skills.
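For illustration, a compact sketch of the EDA-to-model workflow the posting describes, assuming a hypothetical customers.csv with a binary churned column (none of these names come from the employer):

```python
# Minimal pandas + scikit-learn sketch: load data, do light cleaning, fit a
# classifier, and report accuracy. Dataset and columns are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")  # hypothetical input file
df = df.dropna(subset=["monthly_spend", "tenure_months", "churned"])

X = df[["monthly_spend", "tenure_months"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```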
Posted 3 days ago
8.0 years
0 Lacs
Pune, Maharashtra
Remote
Job details Employment Type: Full-Time Location: Pune, Maharashtra, India Job Category: Information Systems Job Number: WD30240341 Job Description Technical Engineer We are looking for a knowledgeable Linux- Sr. Technical engineer to join our team. The ideal candidate will be responsible for overseeing & ensuring the stability and performance of our Linux systems and providing high-level technical assistance to our clients. Responsibilities: Troubleshoot and resolve complex Linux system issues in a timely manner. Monitor system performance, availability, and security of Linux environments. Develop and implement support procedures and best practices for Linux systems. Collaborate with cross-functional teams to enhance system reliability and performance. Provide training and support to junior team members on Linux technologies. Manage escalated support tickets and ensure client satisfaction. Stay updated on industry trends and advancements in Linux technologies. Technical Skills: System Administration: Deep knowledge of Linux/Unix systems, including installation, configuration, and maintenance. Scripting Languages: Proficiency in Bash, Python, Perl, or other scripting languages for automation. Network Management: Understanding of networking protocols, firewall configurations, and network troubleshooting. Security: Knowledge of security best practices, including firewalls, intrusion detection systems, and access control. Virtualization and Cloud Computing: Experience with virtualization (KVM, VMware) and cloud platforms (AWS, Azure, GCP). Configuration Management: Familiarity with tools like Ansible, Puppet, or Chef for automating and managing configurations. Monitoring and Performance Tuning: Ability to monitor system performance and optimize system resources using tools like Nagios, Zabbix, or Prometheus. Storage Management: Understanding of different storage solutions and file systems, including RAID, LVM, and SAN/NAS. Containerization: Experience with Docker and Kubernetes for managing containerized applications. Requirements: Bachelor's degree in Computer Science, Information Technology, or related field. 8+ years of experience in a Linux support role. Strong knowledge of Linux operating systems and server administration. Experience with scripting languages and automation tools. Excellent problem-solving skills and the ability to work under pressure. Strong communication and interpersonal skills. Benefits: Competitive salary and performance-based bonuses. Comprehensive health, dental, and vision insurance plans. Retirement savings plan with company contributions. Professional development opportunities and training. Flexible working hours and remote work options. If you are a passionate Linux professional with a desire to lead and support a talented team, we encourage you to apply for this position.
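A small, hedged example of the scripting-for-automation skill the posting lists, written in Python: it flags filesystems above a usage threshold. The mount points and the 85% cutoff are illustrative choices, not requirements from the posting.

```python
# Minimal monitoring-automation sketch: report filesystems above a usage
# threshold. Mount points and threshold are illustrative.
import shutil

THRESHOLD_PCT = 85


def check_mounts(mount_points):
    alerts = []
    for mount in mount_points:
        usage = shutil.disk_usage(mount)
        pct_used = usage.used / usage.total * 100
        if pct_used >= THRESHOLD_PCT:
            alerts.append("{}: {:.1f}% used".format(mount, pct_used))
    return alerts


if __name__ == "__main__":
    for line in check_mounts(["/", "/var", "/home"]):
        print("ALERT:", line)
```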
Posted 3 days ago
0.0 years
0 Lacs
Pune, Maharashtra
On-site
Job details Employment Type: Full-Time Location: Pune, Maharashtra, India Job Category: Information Systems Job Number: WD30243649 Job Description Job Title: Oracle DBA Location: Pune Job Description: The Oracle DBA will be responsible for managing and maintaining the Oracle Database environment. This includes installation, configuration, monitoring, tuning, and troubleshooting of databases to ensure their optimal performance and security. The successful candidate will work closely with application developers, system administrators, and other stakeholders to design and implement effective database solutions that meet business needs. Key Responsibilities: Install, configure, and maintain Oracle databases in accordance with best practices. Monitor database performance and implement performance-tuning measures as necessary. Implement database security measures to ensure data integrity and confidentiality. Backup and restore databases and ensure data availability and disaster recovery plans are in place. Collaborate with application development teams to design and optimize database structures. Perform regular database health checks and capacity planning. Respond to database-related incidents and requests in a timely manner. Document database configurations, changes, and procedures. Qualifications: 6+ years of experience as an Oracle Database Administrator. Strong knowledge of Oracle database architecture, performance tuning, and backup/recovery strategies. Experience with Oracle RAC, Data Guard, and other Oracle database features. Proficient in SQL and PL/SQL. Good understanding of database security and auditing practices. Strong troubleshooting and problem-solving skills. Excellent communication and teamwork abilities. Bachelor's degree in Computer Science, Information Technology, or a related field. Preferred Skills: Oracle certification (OCA, OCP) preferred. Experience with cloud-based database solutions. Familiarity with Linux/Unix operating systems.
Posted 3 days ago
8.0 years
0 Lacs
Pune, Maharashtra
On-site
You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities
Requisition ID: R-10346528
Date posted: 06/16/2025
End Date: 06/30/2025
City: Pune
State/Region: Maharashtra
Country: India
Additional Locations: Bengaluru, Karnataka; Chennai, Tamil Nadu; Hyderabad, Andhra Pradesh; Noida, Uttar Pradesh
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Expertise SAP Basis Administrator - Pune Location - Early Joiner

What does a successful Sr. SAP Basis Administrator do at Fiserv? As part of the SAP Basis Team, this role will have responsibility for the SAP implementation, testing, performance, and support of on-premises and SAP Azure Cloud systems. This candidate will provide hands-on leadership, working individually or with small teams to support existing applications and participate in new projects. New project work will include the Financial Transformation Program, which is part of multiple internal initiatives focused on operational efficiency and industry best practices. The Program will evolve Fiserv to a centralized ERP business model with financial and procurement shared services based on SAP technology.

What you will do:
You will work with SAP Basis engineers, product managers, application development engineers, networking, and database administrators, and learn to build, optimize, and bring products to the market.
Use your strong technical experience to fine-tune the current landscape and optimize SAP systems for maximum reliability and productivity.
Deployment of SAP support packages as part of the SAP release strategy.
Modification adjustment in the R/3 system.
Upgrades of the SAP kernel and add-on installations, along with SPAM and JSPM updates.
JSPM administration
(deployment of JAVA support packages) SAP licensing audit, maintenance of SAP Hardware keys and maintenance keys for SLD System copies (homogeneous and heterogeneous system copies) SAP Buffer, memory management, performance tuning and troubleshooting Administration of RFC connections to SAP OSS and SLD SLD administration What you will need to have: At least 2 technical S/4 HANA implementations with direct hands-on experience with design, architecture, configuration, and deployment of SAP landscape within a Microsoft Windows server environment Knowledgeable in SAP technology (S/4, BW, BPC, Solution Manager/CHaRMs, OpenText, BOBJ/BODS, GRC, PO, CPS/BPA) SAP S/4 1909 and above experience Linux Administration and rebuild failover clusters for HANA database/SAP Applications, Azure administration, and SAP workloads in azure HANA database administration, HANA database architecture in distributed environment Hands on experience performing NetWeaver SPS upgrades and release upgrades Experience supporting SAP environments on cloud infrastructure. Strong process improvement discipline Deep knowledge of SAP and understanding of SSO Strong problem solving, error analysis, and analytical skills Working knowledge of Security administration Understanding of RDBMS structure and /or administration Nice to have exposure to SAP BTP environment What would be great to have: Bachelor’s degree in business, Computer Science, Information Technology, or equivalent job-related experience 8+ year’s technical experience with installation and support of an ERP Financial software application Ownership and accountability Independent decision making Excellent communication & interpersonal skills – written and oral Strong drive for results Thank you for considering employment with Fiserv. Please: Apply using your legal name Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 3 days ago
0 years
0 Lacs
India
Remote
Role: Tableau Admin
Duration: Long term contract with our direct client
Location: Remote

End-to-End Tableau Platform Management: Oversee the health, security, and performance of our entire Tableau ecosystem, including user administration, content management, and strategic platform evolution.

SAP HANA Integration & Optimization: Serve as the subject matter expert for Tableau's integration with SAP HANA. Diagnose and resolve complex connectivity, data extract, and live connection issues. Collaborate with data engineering teams to optimize HANA views and data models for efficient consumption by Tableau.

Advanced Performance Tuning & Optimization: Conduct comprehensive performance audits of Tableau workbooks and dashboards using tools like the Performance Recorder and log analyzers. Implement and enforce best practices for efficient workbook design, including optimizing calculations, reducing marks, and effective use of filters and parameters. Analyze and tune data source performance, including optimizing custom SQL, promoting the use of efficient connection types, and implementing effective data extract strategies. Work with developers and analysts to refactor and improve the performance of mission-critical dashboards, ensuring a fast and reliable user experience.

Comprehensive Tableau Cloud Setup & Administration: Lead the architecture and administration of our Tableau Cloud environment, including strategic site creation, project organization, and content governance. Implement and manage robust user provisioning and security models, including single sign-on (SSO) integration and granular permissions. Monitor Tableau Cloud usage, performance metrics, and backgrounder/extract refresh schedules to ensure platform stability and resource optimization. Manage the Tableau Bridge client for seamless data connectivity between Tableau Cloud and on-premises data sources. Stay current with the Tableau Cloud release schedule and implement new features to enhance our analytics capabilities.

User Enablement & Support: Develop and maintain documentation on best practices, provide training to business users, and act as the highest level of technical support for the Tableau platform.
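For illustration, a minimal sketch of programmatic Tableau administration of the sort described above, assuming the open-source tableauserverclient package and a personal access token; the site URL and token names are placeholders.

```python
# Minimal Tableau Cloud/Server admin sketch using tableauserverclient (TSC):
# sign in with a personal access token and list workbooks. URL, site, and
# token values are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth(
    "token-name", "token-secret", site_id="example-site"
)
server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

with server.auth.sign_in(auth):
    workbooks, _pagination = server.workbooks.get()
    for wb in workbooks:
        print(wb.name, wb.project_name)
```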
Posted 3 days ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About the company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. It is headquartered in Bengaluru, has gross revenue of ₹222.1 billion, a global workforce of 234,054, and is listed on NASDAQ; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. Major delivery centers in India include Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: webMethods Developer
Work Mode: Hybrid
Location: Pune
Experience: 7+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: webMethods, SOAP, REST, WSDL, XML, JSON

Required Skills and Experience
• Design and develop integration solutions using webMethods Integration Server, Trading Networks, and related components.
• Implement B2B/EDI integrations with external partners.
• Develop and maintain RESTful and SOAP web services.
• Perform troubleshooting, debugging, and performance tuning.
• Collaborate with cross-functional teams to understand integration requirements.
• Ensure code quality, documentation, and adherence to best practices.
• Participate in deployment, monitoring, and support of solutions in production environments.
• Proficient in web service technologies (SOAP, REST, WSDL, XML, JSON).
• Experience with database integration (JDBC, SQL).
Posted 3 days ago
2.0 years
0 Lacs
Kolkata metropolitan area, West Bengal, India
On-site
Hiring – Android Developer @ Underscore Technology Location : Kolkata (In-Office Only) Experience : 2+ Years Domain : System-level Android | AOSP | Custom Launchers | ROMs About Us Underscore Technology Private Limited is one of the top tech firms in Eastern India, dealing in Android, iOS, and Smart TV development, UI/UX design, AWS cloud services, digital transformation, and e-commerce solutions. We don't merely develop apps — we develop high-performance digital experiences that resonate with our customers. With our passionate team of developers, designers, and consultants, we're empowering businesses across all sizes to grow faster and smarter. What You'll Do Build & optimize a custom Launcher – make it slick, smooth, and powerful Customize ROMs – yep, we’ve run VS Code on Android Explore hidden APIs and unlock Android’s undocumented potential Tackle system-level challenges – because we go way deeper than typical app development Collaborate with a cross-functional team to create the best user experiences Build, maintain, and evolve Android apps that perform and scale Required Skills Strong command of Java and Kotlin Solid understanding of Android SDK, Jetpack libraries, and Material Design Proficiency in MVVM/MVC architecture and clean code practices Experience in using Room, Retrofit, Data Binding, LiveData, ViewModel Familiarity with Gradle, ADB, Logcat, and debugging tools Ability to build responsive UI layouts using ConstraintLayout, MotionLayout, etc. Understanding of background tasks using WorkManager, JobScheduler, Services Hands-on experience with Firebase , push notifications, and analytics Version control using Git Experience publishing apps to Google Play Store Unit testing and/or UI testing is a big plus What Makes You a Great Fit You know how Android really works (not only the app level) You've played with AOSP, Launcher3, or custom ROMs – or are excited to learn You can dig into source code and figure out what's actually happening You enjoy solving system-level issues and fine-tuning Android experiences You're a team player who enjoys working together to solve problems Show more Show less
Posted 3 days ago
0 years
0 Lacs
India
On-site
SmalBlu is the world's first cross-layer AI platform that autonomously optimises enterprise data infrastructure across user, app, compute, network, database, and storage layers. Powered by an LLM-driven agentic system and proprietary optimisation and compression technologies, SmalBlu helps enterprises cut cloud costs by 40%, carbon footprint by 35%, and query latency by 30%, while enabling deep infrastructure intelligence for CXOs.

Preference: Fast learners and track owners with 2+ years of industry experience in their field. Part-time also works if you are currently working somewhere (30 hours per week minimum).

AI Engineering Scope of Work:
You will be responsible for building the core agentic intelligence layer of SmalBlu. This includes developing and fine-tuning ML models for optimizing compute, network, and storage layers across enterprises. You'll integrate with LangGraph workflows, vector databases, and cost-performance datasets to enable AI-driven decision-making across the platform.

Responsibilities:
Build intelligent agent workflows using LangChain or LangGraph (see the sketch after this posting).
Train and fine-tune models for resource optimisation, cost prediction, and carbon analytics.
Integrate custom LLM pipelines with APIs and event systems (Kafka, REST).
Work on Pinecone or other vector DBs to power infra-aware memory.

Skills Required:
Proficiency in Python
Experience with TensorFlow/PyTorch, LangChain, Pinecone
Familiarity with LLMs, RAG models, vector search
Strong grasp of optimization, compute, or data infra modeling

About the Position:
Duration: 6-month paid internship with compensation of up to 50k per month + ESOPs, followed by full-time employment based on performance with competitive pay.

Compensation & Perks:
✅ Full-time employment opportunity with full-time pay + incentives post-internship
✅ Hybrid work model (Ahmedabad preferred)
✅ Be a core part of a fast-growing impact-driven startup

Application Link: https://forms.gle/VUhCvyPHrw8czdkv8
You can reach out to us at garv@smalblu.ai via mail or at +91 6354057091/+91 8488006997 for any questions.
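As a library-free sketch (hypothetical names throughout, and a stubbed model call rather than a real LLM), the agent-workflow idea in this posting boils down to a loop like the following:

```python
# Deliberately minimal agent loop: the (stubbed) model picks a tool, the runtime
# executes it, and the observation is fed back until an answer or step budget.
def call_llm(messages):
    # Stand-in for a real LLM call; a production system would use an API client.
    return {"tool": "get_storage_cost", "args": {"bucket": "analytics-raw"}}


TOOLS = {
    "get_storage_cost": lambda bucket: {"bucket": bucket, "monthly_usd": 1240.0},
}


def run_agent(task, max_steps=5):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        decision = call_llm(messages)
        if "answer" in decision:
            return decision["answer"]
        observation = TOOLS[decision["tool"]](**decision["args"])
        messages.append({"role": "tool", "content": str(observation)})
    return "No answer within the step budget."


print(run_agent("Which bucket drives most of our storage spend?"))
```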
Posted 3 days ago
5.0 years
0 Lacs
India
Remote
If you haven’t built and scaled a 0-1 product to at least 7-figure (USD) ARR and/or been part of a YC (or equivalent startup) founding engineer team we kindly ask that you don't apply for this role. Our client is looking for a hands-on engineer who dreams about LLM solutions, keeps up with the latest AI research for fun, and has previously led technical early-stage startups in a YC batch (or an equivalent high-growth environment). About Us Our client is an innovative startup revolutionizing how businesses operate through AI and automation. With a focus on efficiency and scalability, we are building the future of intelligent workflows. To drive this mission, we are seeking a highly skilled Technical Chief of Staf f. This role is for someone deeply embedded in the startup and AI ecosystem, passionate about building AI-powered products from the ground up and integrating them into real-world applications . Role Overview As the Technical Chief of Staff, you will report directly to the CEO and play a critical role in shaping the company’s technical direction. You will build AI-powered applications end-to-end, ensuring seamless integration with existing business processes. This role requires a blend of hands-on engineering, AI research, and strategic thinking to develop scalable solutions that align with company goals. You will collaborate across departments, identifying pain points and implementing AI-driven automation to drive efficiency and innovation. This is a fully remote position. Key Responsibilities 1. AI Product Development & Execution Design, develop, and deploy AI-native applications integrating LLMs and automation. Build and maintain full-stack applications that leverage AI models for decision-making and workflow optimization. Develop and scale agentic AI workflows that automate complex business processes. Continuously refine AI-driven products to improve efficiency, usability, and impact. 2. Strategic Leadership & Cross-Functional Collaboration Partner with teams across the company to identify business challenges and design AI-driven solutions. Bridge technical execution and strategic decision-making to ensure AI initiatives align with company objectives. Develop data-driven strategies to measure and optimize the effectiveness of AI implementations. 3. AI Research & Scalability Stay engaged with the latest AI advancements, including LLMs, multi-agent systems, and emerging frameworks. Architect and deploy scalable AI infrastructure, ensuring efficiency in high-growth environments. Optimize AI models for performance, accuracy, and real-world application. 4. Industry Engagement & Thought Leadership Apply insights from thought leaders like Paul Graham, A16Z, and Sequoia Capital to inform technical and strategic initiatives. Serve as an advisor to leadership on trends in AI, automation, and product scalability. Contribute to the company’s AI thought leadership through research, whitepapers, or technical discussions. 5. Team Enablement & AI Integration Guide and mentor engineering teams on AI adoption, architecture, and optimization. Promote a culture of AI-driven experimentation and continuous learning. Train internal teams on leveraging AI-assisted coding tools such as GitHub Copilot, Claude, and Replit. Who You Are You have a strong track record of building and deploying AI-powered applications end-to-end. You have worked in high-growth startup environments and understand how to ship AI products at speed. You are a strategic thinker who can apply AI to solve real business challenges. 
You thrive in cross-functional collaboration, translating technical concepts for product, design, and business teams. You stay ahead of the curve in AI research and actively experiment with new tools and methodologies. Qualifications a. Technical Expertise 5+ years of experience in full-stack engineering or AI-focused software development. Strong proficiency in Python, JavaScript, React, or equivalent languages. Expertise in LLMs, AI models, and tools such as OpenAI, Hugging Face, LangChain, and vector databases. Experience integrating AI models into production environments, including APIs, fine-tuning, and retrieval-augmented generation (RAG). Proven ability to scale AI-powered products while optimizing for performance and usability. Familiarity with AI-assisted coding tools such as GitHub Copilot, Claude, and Replit. b. Startup & Industry KnowledgeExperience successfully building and scaling AI-powered products in early-stage startups. Strong understanding of startup challenges, trends, and growth strategies. Deep familiarity with thought leaders in AI and startups (e.g., A16Z, Paul Graham,Sequoia). c. Soft Skills Strong communication and collaboration abilities. Excellent organizational and project management skills. A growth mindset with a passion for AI-driven innovation. What We Offer The opportunity to build and scale AI-powered products in a fast-moving startup environment. A chance to shape the future of AI-driven automation and digital workforces. A collaborative, high-velocity culture that values experimentation and impact. Fully remote work with flexible arrangements. How to Apply Please submit your resume, portfolio, and a brief cover letter detailing: Your experience with AI, LLMs, and automation workflows. Examples of AI products you’ve built and scaled end-to-end. Your insights into the startup ecosystem and how they inform your approach to AI-driven product development. If you are passionate about building AI-native products and driving innovation in a high-growth startup, we’d love to hear from you! Show more Show less
Posted 3 days ago
0 years
0 Lacs
India
On-site
Job Summary * AS-400 exp required- 4 yr min / Looking for Admin and support/ no developer profile. We are seeking an experienced AS400 Infrastructure Operations Support Engineer to manage and support our AS400 systems. The ideal candidate will have extensive experience in AS400 infrastructure management, performance optimization, backup and recovery, security, and troubleshooting using specific tools and technologies. Key Responsibilities Infrastructure Management Installation, Configuration, and Upgrades: Set up and maintain AS400 systems, including software upgrades and patches, using tools like IBM i Access Client Solutions (ACS) and IBM Navigator for i. Storage Management: Allocate storage space, manage libraries and objects, and plan for future storage needs using tools like IBM i Disk Management and BRMS (Backup, Recovery, and Media Services). User Management: Create and manage user accounts, assign permissions, and control access to the system using IBM Security Access Manager and IBM i User Profiles. Security: Implement and maintain security measures to protect AS400 systems from unauthorized access and breaches using IBM i Security Tools and IBM Guardium. Performance Optimization Monitoring: Proactively monitor system performance, identify bottlenecks, and take corrective actions using tools like IBM Performance Tools for i and IBM i Performance Navigator. Tuning: Tune system parameters, SQL queries, and indexes to optimize performance using IBM i SQL Performance Analyzer and IBM i Performance Data Investigator. Backup and Recovery Backups: Regularly back up systems to ensure data protection and disaster recovery using tools like BRMS and IBM i Save/Restore. Recovery: Restore systems from backups and resolve data corruption issues using BRMS and IBM i Recovery Tools. Data Integrity Data Modeling: Work with data architects to design and implement efficient database schemas using tools like IBM Data Studio and IBM InfoSphere Data Architect. Data Validation: Ensure data accuracy and consistency through data validation and integrity checks using IBM InfoSphere QualityStage. Collaboration Developer Support: Collaborate with developers to design and implement new system features and applications using tools like IBM Rational Developer for i and IBM i Access Client Solutions. User Support: Provide training and support to users on how to access and use AS400 systems. Troubleshooting Problem Solving: Identify and resolve system issues, errors, and performance problems using tools like IBM i Service Tools and IBM i System Logs. Documentation: Maintain documentation of system configurations, procedures, and troubleshooting steps using tools like Confluence and SharePoint. Daily Tasks Monitor and manage the health and performance of AS400 systems. Perform routine maintenance tasks, including updates, patches, and backups. Troubleshoot and resolve technical issues related to AS400 systems. Collaborate with other IT teams and stakeholders to ensure seamless integration and operation of AS400 systems. Show more Show less
Posted 3 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Java Enterprise Edition Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with business objectives, ensuring that the solutions provided are effective and efficient. Your role will require you to stay updated with industry trends and best practices to continuously improve application performance and user experience. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Facilitate regular team meetings to discuss progress, challenges, and solutions. Professional & Technical Skills: - Must To Have Skills: Proficiency in Java Enterprise Edition. - Good To Have Skills: Experience with Spring Framework, Hibernate, and Microservices architecture. - Strong understanding of software development life cycle methodologies. - Experience with application performance tuning and optimization. - Familiarity with cloud platforms and deployment strategies. Additional Information: - The candidate should have minimum 5 years of experience in Java Enterprise Edition. - This position is based in Chennai. - A 15 years full time education is required. 15 years full time education Show more Show less
Posted 3 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Conversational AI
Good to have skills: React.js, Cloud Network Operations
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and enhance application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement new features and functionalities in applications.
- Collaborate with cross-functional teams to ensure successful project delivery.
- Conduct code reviews and provide constructive feedback to team members.
- Troubleshoot and debug applications to optimize performance.
- Stay updated on industry trends and best practices to enhance application development processes.

Professional & Technical Skills (project specific):
- Dialogflow CX Conversational Developer with experience in enterprise-level conversational AI agents.
- Expert in conversational design best practices.
- Experience in NLU tuning.
- Experience in utilizing Dialogflow CX's Generative AI capabilities.
- Experience in webhook development (NodeJS/Python); see the sketch after this posting.
- Git, CI/CD experience.
- Must To Have Skills: Proficiency in Conversational AI.
- Good To Have Skills: Experience with React.js.
- Strong understanding of natural language processing and machine learning algorithms.
- Hands-on experience in developing chatbots and virtual assistants.
- Knowledge of cloud computing platforms and network operations.
- Familiarity with agile development methodologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Conversational AI.
- This position is based at our Gurugram office.
- A 15 years full time education is required.
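For illustration, a minimal Python (Flask) webhook sketch of the kind this role mentions. It assumes the standard Dialogflow CX webhook JSON (fulfillmentInfo.tag in the request, fulfillmentResponse in the reply); the route and port are arbitrary choices.

```python
# Minimal Dialogflow CX webhook sketch in Flask: echo the matched tag back as a
# text message. Route and port are arbitrary choices for the example.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(silent=True) or {}
    tag = body.get("fulfillmentInfo", {}).get("tag", "")
    reply = "Handled webhook tag: {}".format(tag) if tag else "Hello from the webhook."
    return jsonify({
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply]}}]
        }
    })


if __name__ == "__main__":
    app.run(port=8080)
```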
Posted 3 days ago
5.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Consultant - Oracle APEX
Job Date: Jun 15, 2025
Job Requisition Id: 61630
Location: Bangalore, KA, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Oracle APEX professionals in the following areas:

Job Description:
Key Responsibilities:
Design, develop, test, and document business applications using Oracle APEX.
Deploy and configure Oracle APEX and Oracle REST Data Services (ORDS) on Red Hat Linux servers.
Perform Linux system administration tasks, including installation, patching, memory management, and performance tuning.
Use and manage RHEL command-line tools effectively for system operations and troubleshooting.
Review and clarify business requirements, design technical solutions, and document ER diagrams and implementation details.
Coordinate with business and IT stakeholders to ensure smooth testing and delivery.
Maintain service levels in alignment with KPIs and contribute to continuous improvement of IT services.
Support diverse IT systems and contribute to technology projects across development and support functions.
Participate in process evaluation and enhancement of the service management framework.

Mandatory Skills:
4–5 years of hands-on experience in Oracle APEX application development.
Strong knowledge of Red Hat Enterprise Linux (RHEL), including installation and maintenance, memory and performance management, and strong command-line (CLI) proficiency.
Proficiency in Oracle SQL, PL/SQL, and Linux shell scripting.
Understanding of the Software Development Life Cycle (SDLC) and ITIL practices.
Ability to independently manage development and support tasks.
Strong understanding of enterprise IT landscapes and application environments.
Strong analytical and problem-solving skills.

Preferred Skills:
Red Hat certifications (e.g., RHCSA, RHCE) and Oracle APEX certifications are an advantage.
Familiarity with Tomcat, WebLogic, or other application servers.
Experience with DevOps tools or automation frameworks.
Exposure to international, multi-location work environments.

Additional Requirements:
Excellent verbal and written communication skills.
Candidates must be based in Bangalore or willing to relocate to Bangalore upon selection.
Service-oriented mindset with flexibility to take on various IT tasks as needed.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 3 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Entity: Accenture Strategy & Consulting
Team: Global Network - Data & AI
Practice: CMT – Software & Platforms
Title: Level 9 - Ind & Func AI Decision Science Consultant
Job location: Hyderabad/Bangalore

About S&C - Global Network:
The Accenture Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

About the Software & Platforms Team:
The team is focused on driving Data & AI based solutions for Accenture's SaaS and PaaS clients. The team collaborates actively with onsite counterparts to help identify opportunities for growth and drives client deliveries from offshore.

WHAT'S IN IT FOR YOU?
As part of our Data & AI practice, you will join a worldwide network of smart and driven colleagues experienced in leading statistical tools, methods, and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Accenture will continually invest in your learning and growth. You'll work with experts in SaaS & PaaS, and Accenture will support you in growing your own tech stack and certifications. In Data & AI you will understand the importance of sound analytical decision-making and the relationship of tasks to the overall project, and execute projects in the context of a business performance improvement initiative.

What You Would Do In This Role
Gather business requirements to create a high-level business solution framework aligned with business objectives and goals.
Monitor project progress, plan the project, proactively identify risks, and develop mitigation strategies.
Work closely with project leads, engineers, and business analysts to develop AI solutions.
Develop and test AI algorithms and techniques tailored to solve specific business problems (see the segmentation sketch after this posting).
Present and communicate solutions and project updates to internal and external stakeholders.
Foster positive client relationships by ensuring alignment between project deliverables and client expectations.
Adopt a clear and systematic approach to complex issues; analyze relationships between several parts of a problem or situation; anticipate obstacles and identify a critical path for a project.
Mentor and guide a team of AI professionals, cultivating a culture of innovation, collaboration, and excellence.
Conduct comprehensive market research and stay updated on the latest advancements and trends in AI technologies.
Foster the professional development of team members through continuous learning opportunities.

Who are we looking for?
Bachelor's or master's degree in computer science, engineering, data science, or a related field.
Experience working for large software or platform organizations.
Proven experience (5+ years) working on AI projects and delivering successful outcomes.
Hands-on exposure to Generative AI frameworks (Azure OpenAI, Vertex AI) and implementations, and strong knowledge of AI technologies, including embeddings, prompt engineering, natural language processing, computer vision, etc.
Hands-on experience in building and deploying statistical models/machine learning, including segmentation and predictive modelling, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization.
Proficiency in statistical packages such as R, Python, Java, SQL, Spark, etc.
Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases like SQL.
Experience in training large language models and fine-tuning them for specific applications or domains.
Understanding of linguistic concepts, encompassing syntax, semantics, and pragmatics, to enhance language modeling.
Experience with cloud platforms like AWS, Azure, or Google Cloud for deploying and scaling language models.
Understanding of containerization technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes) for managing and deploying models, and exposure to CI/CD pipelines for automated testing and deployment of language models.
Excellent analytical and problem-solving skills, with a data-driven mindset.
Strong project management abilities, including planning, resource management, and risk assessment.
Proficient in Excel, MS Word, PowerPoint, etc.
Exceptional communication and interpersonal skills to engage effectively with clients and internal stakeholders.

Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
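For illustration of the segmentation and predictive-modelling work listed above, a compact scikit-learn sketch on hypothetical customer features (file and column names are placeholders):

```python
# Minimal customer-segmentation sketch: scale features and cluster with KMeans.
# Input file and columns are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

customers = pd.read_csv("customers.csv")  # placeholder input
features = customers[["annual_spend", "tenure_months", "support_tickets"]]

scaled = StandardScaler().fit_transform(features)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
customers["segment"] = kmeans.fit_predict(scaled)

print(customers.groupby("segment")[features.columns.tolist()].mean())
```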
Posted 3 days ago
7.0 years
0 Lacs
Haveli, Maharashtra, India
On-site
Sr. Software Engineer - Azure Power BI
Job Date: Jun 15, 2025
Job Requisition Id: 61602
Location: Pune, MH, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Power BI professionals in the following areas:

Job Description:
The candidate should have good hands-on Power BI experience with a robust background in Azure data modeling and ETL (Extract, Transform, Load) processes, essential hands-on experience with advanced SQL and Python, proficiency in building data lakes and pipelines using Azure, and MS Fabric implementation experience. Additionally, knowledge and experience in the Quality domain and Azure certifications are considered a plus.

Required Skills:
7+ years of experience in software engineering, with a focus on data engineering.
Proven 5+ years of extensive hands-on experience in Power BI report development.
Proven 3+ years in data analytics, with a strong focus on Azure data services.
Strong experience in data modeling and ETL processes (see the sketch at the end of this posting).
Advanced hands-on SQL and Python knowledge and experience working with relational databases for data querying and retrieval.
Drive best practices in data engineering, data modeling, data integration, and data visualization to ensure the reliability, scalability, and performance of data solutions.
Should be able to work independently end to end and guide other team members.
Exposure to Microsoft Fabric is good to have.
Good knowledge of SAP and quality processes.
Excellent business communication skills.
Good data analytical skills to analyze data and understand business requirements.
Excellent knowledge of SQL for performing data analysis and performance tuning.
Ability to test and document end-to-end processes.
Proficiency in the MS Office suite (Word, Excel, PowerPoint, Access, Visio).
Proven strong relationship-building and communication skills with team members and business users.
Excellent communication and presentation skills, with the ability to effectively convey technical concepts to non-technical stakeholders.
Ability to partner with business stakeholders to understand their data requirements, challenges, and opportunities, and to identify areas where data analytics can drive value.
Desired Skills:
Extensive hands-on experience with Power BI.
Proven 5+ years of experience in data analytics with a strong focus on Azure data services and Power BI.
Exposure to Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
Solid understanding of data visualization and engineering principles, including data modeling, ETL/ELT processes, and data warehousing concepts.
Experience with Microsoft Fabric is good to have.
Strong proficiency in SQL.
HANA modeling experience is nice to have.
BusinessObjects and Tableau are nice to have.
Experience working in a captive unit is a plus.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.
Responsibilities:
Work with Quality and IT teams to design and implement data solutions, including the methods and processes used to translate business needs into functional and technical specifications.
Design, develop, and maintain robust data models, ETL pipelines, and visualizations.
Build Power BI reports and dashboards.
Build a new data lake in Azure, expanding and optimizing our data platform and data pipeline architecture, and optimizing data flow and collection for cross-functional teams.
Design and develop solutions in Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, and Fabric.
Develop and maintain Python scripts for data processing and automation (a minimal sketch follows below).
Troubleshoot and resolve data-related issues and provide support for escalated technical problems.
Drive process improvement.
Ensure data quality and integrity across various data sources and systems.
Maintain the quality and integrity of data in the warehouse, correcting any data problems.
Participate in code reviews and contribute to best practices for data engineering.
Ensure data security and compliance with relevant regulations and best practices.
Develop standards, process flows, and tools that promote and facilitate the mapping of data sources, documenting interfaces and data movement across the enterprise.
Ensure the design meets the requirements.
Education: IT graduate (BE, BTech, MCA) preferred.
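To make the SQL-plus-Python responsibilities above concrete, here is a minimal, hedged sketch of an extract-and-land script. It assumes a SQL Server source reachable through SQLAlchemy/pyodbc and a hypothetical `quality_results` table; the connection string, query, and lake path are placeholders, not details from the posting.

```python
"""Minimal sketch: pull a dataset from SQL Server and land it as Parquet.

Assumptions (not from the posting): a SQL Server source, a hypothetical
`quality_results` table, and a local or mounted lake folder.
Requires: pandas, sqlalchemy, pyodbc, pyarrow.
"""
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string -- replace with your own server/database.
CONN_STR = "mssql+pyodbc://user:password@my-server/my-db?driver=ODBC+Driver+18+for+SQL+Server"

# Keep heavy filtering in SQL so only the needed rows move over the wire.
QUERY = """
    SELECT plant_id, batch_id, defect_count, inspected_on
    FROM quality_results
    WHERE inspected_on >= DATEADD(day, -30, GETDATE())
"""

def extract_to_parquet(lake_path: str) -> int:
    """Run the query, add a simple derived column, write Parquet; return row count."""
    engine = create_engine(CONN_STR)
    df = pd.read_sql_query(QUERY, engine)
    # Example transformation: flag batches above an illustrative defect threshold.
    df["high_defect"] = df["defect_count"] > 5
    df.to_parquet(lake_path, index=False)  # Parquet writing needs pyarrow
    return len(df)

if __name__ == "__main__":
    rows = extract_to_parquet("landing/quality_results.parquet")
    print(f"Wrote {rows} rows")
```

In a production Azure setup this logic would more likely sit inside a Data Factory or Fabric pipeline activity than a standalone script; the sketch only illustrates the query-transform-land pattern.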
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles:
Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and ethical corporate culture
Posted 3 days ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
About
Our Financial Crimes specialist teams provide solutions to BFSI clients by conducting model validation testing for AML risk models and frameworks, sanctions screening, and transaction monitoring systems to ensure the efficiency and efficacy of the underlying frameworks, both functionally and statistically. We are looking to hire colleagues with advanced data science and analytics skills to support our financial crimes team. You will play a crucial role in helping clients tackle the multifaceted challenges of financial crime. By utilizing advanced analytics and deep technical knowledge, our team aids top clients in reducing risks associated with financial crime, terrorist financing, and sanctions violations. We also work to enhance their screening and transaction monitoring systems. Our team of specialized analysts ensures that leading financial institutions adhere to industry best practices for robust programs and controls. Through a variety of project experiences, you will develop your professional skills, assisting clients in understanding and addressing complex issues, and implementing top-tier solutions to resolve identified problems.
Minimum work experience: 3+ years of advanced analytics
Preferred experience: 1+ years in AML model validation
Responsibilities:
Support functional SME teams to build data-driven financial crimes solutions.
Conduct statistical testing of the screening matching algorithms, risk rating models, and thresholds configured for detection rules.
Validate data models of AML systems built on platforms such as SAS Viya, Actimize, LexisNexis, Napier, etc.
Develop, validate, and maintain AML models to detect suspicious activities and transactions.
Conduct Above the Line and Below the Line testing.
Conduct thorough model validation processes, including performance monitoring, tuning, and calibration.
Ensure compliance with regulatory requirements and internal policies related to AML model risk management.
Collaborate with cross-functional teams to gather and analyze data for model development and validation.
Perform data analysis and statistical modeling to identify trends and patterns in financial transactions.
Prepare detailed documentation and reports on model validation findings and recommendations.
Assist in feature engineering to improve Gen AI prompts used for automating AML/screening-related investigations.
Use advanced machine learning (e.g., XGBoost) and GenAI approaches (an illustrative sketch follows this posting).
Criteria:
Bachelor’s degree from an accredited university.
3+ years of hands-on experience in Python, with experience in FastAPI, Django, Tornado, or Flask frameworks.
Working experience with relational and NoSQL databases such as Oracle, MS SQL, MongoDB, or Elasticsearch.
Proficiency in BI tools such as Power BI, Tableau, etc.
Proven experience in data model development and testing.
Educational background in Data Science and Statistics.
Strong proficiency in programming languages such as Python, R, and SQL.
Expertise in machine learning algorithms, statistical analysis, and data visualization tools.
Familiarity with regulatory guidelines and standards for AML.
Experience in AML-related model validation and testing.
Expertise in techniques and algorithms including sampling, optimization, logistic regression, cluster analysis, neural networks, decision trees, and supervised and unsupervised machine learning.
Preferred Experience:
Validation of AML compliance models, such as statistical testing of customer/transaction risk models, screening algorithm testing, etc.
Experience with developing proposals (especially new solutions).
Experience working with AML technology platforms, e.g., Norkom, SAS, LexisNexis, etc.
Hands-on experience with data analytics tools such as Informatica, Kafka, etc.
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Qualifications:
Bachelor’s degree from an accredited university.
Educational background in Data Science and Statistics.
3+ years of hands-on experience in data science and data analytics.
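As a rough illustration of the model-validation and XGBoost work listed above, the sketch below trains and checks a toy transaction-risk classifier on synthetic data. The features, labels, threshold, and metrics are invented for demonstration and do not reflect KPMG's actual methodology.

```python
"""Illustrative sketch only: challenger-style checks on a toy risk model.
Synthetic data; all features and thresholds are placeholders.
Requires: numpy, scikit-learn, xgboost.
"""
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
n = 5000

# Synthetic transaction features: amount, monthly transaction count, cross-border flag.
X = np.column_stack([
    rng.lognormal(mean=7, sigma=1.2, size=n),   # amount
    rng.poisson(lam=20, size=n),                # monthly transaction count
    rng.integers(0, 2, size=n),                 # cross-border indicator
])
# Synthetic "suspicious" label loosely tied to amount and cross-border activity.
risk = 0.00002 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 0.5, size=n)
y = (risk > np.quantile(risk, 0.95)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X_train, y_train)

# Validation-style checks: discrimination (AUC) and alert volume at a cut-off.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, scores), 3))
threshold = 0.5  # illustrative cut-off, not a recommended tuning value
print("Alert rate at threshold:", round((scores >= threshold).mean(), 4))
```

Real Above/Below the Line testing would also compare alert dispositions around production thresholds and document the sampling approach; this snippet only shows the shape of the statistical checks.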
Posted 3 days ago
5.0 - 31.0 years
0 - 0 Lacs
Preet Vihar, New Delhi
Remote
We are looking for a Senior Android Developer who can:
• Design and build Android apps
• Collaborate with teams to add new features
• Work with APIs (REST, JSON)
• Write clean, testable code
• Fix bugs and improve performance
• Stay updated with Android technology trends
• Has strong knowledge of the Android SDK, Java/Kotlin, Git, and design principles
• Has experience with different screen sizes, offline storage, and performance tuning
Location: Madhuban Enclave, Preet Vihar
Type: Full-Time
Experience: Senior level
Posted 3 days ago
0.0 - 31.0 years
0 - 0 Lacs
Vadodara
Remote
Key Responsibilities:
Evaluate AI-generated code snippets, explanations, or answers against prompts or reference solutions.
Compare multiple AI responses and rank them based on correctness, efficiency, readability, and relevance (a hedged example sketch follows this posting).
Identify and document bugs, logical errors, and inconsistencies in AI-generated code or explanations.
Provide detailed feedback and quality ratings that feed directly into AI model training and fine-tuning processes.
Collaborate with AI researchers, prompt engineers, and tool developers to improve evaluation workflows and data quality.
Contribute to internal documentation and improvement of evaluation guidelines.
Required Skills:
Proficiency in front-end technologies: HTML, CSS, JavaScript, and React or similar frameworks.
Familiarity with back-end development using Python, C++, or Java.
Experience using Git and GitHub for version control and collaborative development.
Basic understanding of RESTful APIs and database systems (SQL and/or NoSQL).
Strong problem-solving, analytical, and communication skills.
Basic knowledge of data structures and algorithms (DSA).
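As a hypothetical illustration of the ranking workflow described above, the sketch below scores responses against a weighted rubric. The criteria weights, data structures, and sample notes are assumptions, not this team's actual evaluation tooling.

```python
"""Hypothetical sketch: rubric-based ranking of AI-generated responses."""
from dataclasses import dataclass

# Weights are assumptions; a real rubric would come from the evaluation guidelines.
WEIGHTS = {"correctness": 0.5, "efficiency": 0.2, "readability": 0.2, "relevance": 0.1}

@dataclass
class Evaluation:
    response_id: str
    scores: dict        # criterion -> 1..5 rating from a human reviewer
    notes: str = ""     # bugs / logical errors observed

    def weighted_score(self) -> float:
        return sum(WEIGHTS[c] * self.scores.get(c, 0) for c in WEIGHTS)

def rank(evaluations: list[Evaluation]) -> list[Evaluation]:
    """Best response first, by weighted rubric score."""
    return sorted(evaluations, key=lambda e: e.weighted_score(), reverse=True)

if __name__ == "__main__":
    evals = [
        Evaluation("model_a",
                   {"correctness": 5, "efficiency": 3, "readability": 4, "relevance": 5},
                   notes="Handles edge cases; O(n log n)."),
        Evaluation("model_b",
                   {"correctness": 3, "efficiency": 4, "readability": 5, "relevance": 4},
                   notes="Off-by-one error on empty input."),
    ]
    for e in rank(evals):
        print(e.response_id, round(e.weighted_score(), 2), "-", e.notes)
```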
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
We are seeking a highly skilled and experienced Oracle CPQ Tech Lead to drive the technical design, development, and implementation of Oracle CPQ solutions. The ideal candidate will possess deep expertise in Oracle CPQ configuration, customization, and integration, along with strong leadership and communication skills. As a Tech Lead, you will be responsible for leading a team of developers, ensuring the delivery of high-quality solutions, and providing technical guidance to both the project team and stakeholders. You will play a critical role in shaping our CPQ strategy and ensuring its alignment with business objectives.
What You’ll Do
Lead the technical design, development, and implementation of Oracle CPQ solutions, ensuring adherence to best practices and architectural standards.
Provide technical leadership and guidance to a team of CPQ developers, including code reviews, mentoring, and knowledge sharing.
Collaborate with business analysts, functional consultants, and stakeholders to gather and analyze business requirements and translate them into technical specifications.
Configure and customize Oracle CPQ to meet specific business needs, including product configuration, pricing rules, and approval workflows.
Develop and maintain integrations between Oracle CPQ and other enterprise systems, such as CRM (Salesforce), ERP (Oracle EBS), and other relevant applications (an illustrative integration sketch follows this posting).
Design and implement complex CPQ solutions using BML (BigMachines Extensible Language), JavaScript, and other relevant technologies.
Troubleshoot and resolve complex technical issues related to Oracle CPQ.
Participate in all phases of the software development lifecycle (SDLC), including requirements gathering, design, development, testing, deployment, and support.
Ensure the quality and performance of CPQ solutions through unit testing, integration testing, and performance testing.
Stay up-to-date with the latest Oracle CPQ releases, features, and best practices.
Contribute to the development of CPQ standards, guidelines, and best practices.
Manage technical risks and issues, and escalate them as appropriate.
Provide technical leadership during project estimation, planning, and execution.
What Experience You Need
Bachelor's degree in Computer Science, Information Systems, or a related field.
3+ years of experience in Oracle CPQ implementations, with a strong understanding of CPQ concepts, architecture, and functionality.
1+ years of experience in a technical lead role, leading development teams.
Deep expertise in Oracle CPQ configuration, including product configuration, pricing, rules, and document generation.
Strong programming skills in BML and JavaScript.
Experience with integrating Oracle CPQ with Salesforce and other Oracle Fusion systems.
Solid understanding of web technologies (HTML, CSS, XML, XSLT).
Experience with relational databases (e.g., Oracle, SQL Server).
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a team.
Strong understanding of software development lifecycle (SDLC) methodologies (e.g., Agile, Waterfall).
What could set you apart
Oracle CPQ 2024 Certified Implementation Professional.
Salesforce Associate or Salesforce Certified Administrator.
Experience with Oracle CPQ Cloud.
Experience with other Oracle Cloud applications (e.g., Sales Cloud, ERP Cloud).
Experience with web services (SOAP, REST).
Experience with single sign-on (SSO) and security protocols.
Experience with performance tuning and optimization of CPQ solutions.
Knowledge of industry best practices for CPQ implementations.
Experience with CPQ implementations for specific industries (e.g., manufacturing, telecommunications, high-tech).
Experience with data migration and data management related to CPQ.
Strong understanding of Quote to Cash business processes.
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
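To illustrate the kind of REST integration mentioned above, here is a hedged sketch of pushing quote data to a downstream system. The URL, token, and payload fields are placeholders; they are not the actual Oracle CPQ or Salesforce APIs, only the general integration pattern.

```python
"""Illustrative only: POSTing quote data to a placeholder middleware endpoint.
Nothing here reflects the real Oracle CPQ or Salesforce REST APIs.
Requires: requests.
"""
import requests

API_BASE = "https://integration.example.com/api"   # placeholder middleware endpoint
TOKEN = "replace-with-a-real-token"

def push_quote(quote: dict) -> dict:
    """POST a quote summary and return the downstream system's JSON response."""
    resp = requests.post(
        f"{API_BASE}/quotes",
        json=quote,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()   # surface HTTP errors instead of failing silently
    return resp.json()

if __name__ == "__main__":
    example = {"quote_id": "Q-1001", "account": "ACME", "total": 12500.0, "currency": "USD"}
    print(push_quote(example))
```

In a real CPQ landscape this call would typically be made from (or to) the platform's own integration layer, with authentication and payload schemas defined by the specific Oracle and Salesforce APIs in use.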
Posted 3 days ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Total Experience: 3 to 5 years
Role & Responsibilities
Agentic AI Development:
Design and develop multi-agent conversational frameworks with adaptive decision-making capabilities.
Integrate goal-oriented reasoning and memory components into agents using transformer-based architectures.
Build negotiation-capable bots with real-time context adaptation and recursive feedback processing.
Generative AI & Model Optimization:
Fine-tune LLMs/SLMs using proprietary and domain-specific datasets (NBFC, Financial Services, etc.).
Apply distillation and quantization for efficient deployment on edge devices (a hedged loading sketch follows this posting).
Benchmark LLM/SLM performance in server vs. edge environments for real-time use cases.
Speech and Conversational Intelligence:
Implement contextual dialogue flows using speech inputs with emotion and intent tracking.
Evaluate and deploy advanced Speech-to-Speech (S2S) models for naturalistic voice responses.
Work on real-time speaker diarization and multi-turn, multi-party conversation tracking.
Voice Biometrics & AI Security:
Train and evaluate voice biometric models for secure identity verification.
Implement anti-spoofing layers to detect deepfakes, replay attacks, and signal tampering.
Ensure compliance with voice data privacy and ethical AI guidelines.
Self-Learning & Autonomous Adaptation:
Develop frameworks for agents to self-correct and adapt using feedback loops without full retraining.
Enable low-footprint learning systems on-device to support personalization on the edge.
Ideal Candidate
Educational Qualifications: Bachelor’s/Master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
Experience Required: 3–8 years of experience, with a mix of core software development and AI/ML model engineering, including proven hands-on work with Conversational AI, Generative AI, or Multi-Agent Systems.
Technical Proficiency:
Strong programming in Python, TensorFlow/PyTorch, and model APIs (Hugging Face, LangChain, OpenAI, etc.).
Expertise in STT, TTS, S2S, speaker diarization, and speech emotion recognition.
LLM fine-tuning, model optimization (quantization, distillation), and RAG pipelines.
Understanding of agentic frameworks, cognitive architectures, or belief-desire-intention (BDI) models.
Familiarity with Edge AI deployment, low-latency model serving, and privacy-compliant data pipelines.
Desirable:
Exposure to agent-based simulation, reinforcement learning, or behavioral modeling.
Publications, patents, or open-source contributions in conversational AI or GenAI systems.
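As a hedged sketch of the quantization work described above, the snippet below loads a causal language model in 4-bit precision using Hugging Face transformers with bitsandbytes. The model id and prompt are placeholders, and real edge deployments would add further steps (distillation, serving infrastructure) beyond this.

```python
"""Sketch: 4-bit quantized loading of a causal LM with transformers + bitsandbytes.
The model id and prompt are placeholders, not this team's actual pipeline.
Requires a CUDA GPU plus the transformers, accelerate, and bitsandbytes packages.
"""
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "your-org/your-finetuned-model"   # placeholder model id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",          # let accelerate place layers on available devices
)

prompt = "Summarise the customer's repayment options in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```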
Posted 3 days ago
The job market for tuning professionals in India is constantly growing, with many companies actively seeking skilled individuals to optimize and fine-tune their systems and applications. Tuning jobs can be found in a variety of industries, including IT, software development, and data management.
India's major tech hubs are known for their thriving tech industries and offer numerous opportunities for tuning professionals.
The average salary range for tuning professionals in India varies based on experience and location. Entry-level roles may offer salaries starting from INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10-15 lakhs per annum.
In the field of tuning, a typical career path may include progression from Junior Tuning Specialist to Senior Tuning Engineer, and eventually to Lead Tuning Architect. With experience and expertise, professionals can take on more challenging projects and leadership roles within organizations.
In addition to tuning skills, professionals in this field are often expected to have knowledge in areas such as database management, performance optimization, troubleshooting, and scripting languages like SQL or Python.
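To give a concrete flavor of what performance optimization can look like in practice, here is a small, self-contained sketch using Python's built-in sqlite3 module. The table and query are invented for demonstration, and real tuning work depends on the specific database engine and workload.

```python
"""Self-contained illustration of one common tuning step: adding an index.
Uses only the Python standard library (sqlite3); table and data are made up.
"""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 1000, i * 1.5) for i in range(50_000)],
)

QUERY = "SELECT COUNT(*), SUM(amount) FROM orders WHERE customer_id = 42"

def show_plan(label: str) -> None:
    """Print SQLite's query plan; the last column of each row is the plan text."""
    plan = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()
    print(label, [row[-1] for row in plan])

show_plan("Before index:")   # expect a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
show_plan("After index:")    # expect a search using idx_orders_customer
```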
As you navigate the job market for tuning roles in India, remember to showcase your expertise, stay updated on industry trends, and prepare thoroughly for interviews. With the right skills and mindset, you can land a rewarding career in this dynamic field. Good luck!
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.