4894 Data Processing Jobs - Page 37

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0.0 - 5.0 years

12 - 60 Lacs

jalandhar

Work from Office

Responsibilities:
* Manage computer systems and software applications
* Collaborate with team on projects and tasks
* Maintain accurate records and reports
* Input data into database using Excel and Word

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

tiruppur, tamil nadu

On-site

As a Process Associate in our BPO team located in Gandhi Nagar, Tirupur, Udumalapet, you will be responsible for handling data conversion and processing projects. Your role will involve performing data entry, conversion, and processing tasks with utmost accuracy and attention to detail. You will collaborate with team members and supervisors to ensure timely completion of projects while maintaining the confidentiality and precision of sensitive information. Additionally, you will be expected to identify and escalate any discrepancies in the data processing workflow and adapt efficiently to new projects and processes.

We welcome both freshers and experienced candidates to apply for this position. Strong typing speed, accuracy, and the ability to quickly learn new tools and software are essential requirements. Basic knowledge of MS Office tools such as Excel and Word will be advantageous. Effective communication, coordination skills, and the ability to work efficiently in a team setting are also key attributes we are looking for in potential candidates.

In return for your contributions, we offer a competitive salary based on your experience, with training provided for freshers and ample opportunities for career growth in the BPO sector. Our work environment is friendly, supportive, and conducive to your professional development.

This is a full-time position, with the possibility of part-time, fresher, or internship roles. The contract length is 3 months, with an expected workload of 30 hours per week. Benefits include paid sick time, paid time off, and a performance bonus. The work schedule is during day shift hours, and the work location is in person at our office in Gandhi Nagar, Tirupur, Udumalapet.

To learn more about this opportunity and apply for the position, please contact us at +91 9688638303. We look forward to potentially welcoming you to our team as a Process Associate focused on data conversion and processing projects.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Senior Engineer - Perception Systems at Rebhu Computing, you will play a crucial role in leading the design and development of high-performance perception modules for next-generation robotic systems. Your responsibilities will encompass technical leadership, team and project support, engineering management, hiring, and team building.

In terms of technical leadership, you will be tasked with architecting end-to-end perception systems, overseeing the integration of various perception modules with robotic platforms, and driving the optimization of real-time computer vision and sensor fusion stacks. Additionally, you will be expected to mentor junior engineers, collaborate with cross-functional teams, and ensure effective communication between internal stakeholders and clients. Setting standards for code quality, testing, and continuous integration, leading technical decision-making, and acting as a technical point of contact for clients and internal leadership are also part of your role. When it comes to hiring and team building, you will define role requirements, conduct technical interviews, and contribute to building a world-class perception team. Your expertise will help shape the engineering culture through thoughtful hiring and mentoring practices.

To be successful in this role, you must have a minimum of 5 years of experience in computer vision, sensor fusion, or robotics perception, as well as proficiency in C++ and/or Python for real-time systems. Strong experience with frameworks like OpenCV, ROS, and TensorRT, and hands-on experience deploying models on embedded or edge platforms are essential. A deep understanding of camera models, calibration, and real-time data processing is also required. Experience with SLAM, multi-sensor fusion, or 3D vision pipelines, familiarity with embedded Linux, GStreamer, or low-latency video pipelines, and prior leadership, mentoring, or hiring experience are considered nice-to-have qualifications.

In return, Rebhu Computing offers you the opportunity to lead perception in mission-critical, real-world systems, a collaborative and intellectually vibrant work environment, as well as competitive compensation and performance-based incentives.
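As a rough illustration of the camera-model and calibration work this posting references, here is a minimal Python/OpenCV sketch that undistorts a frame using an intrinsic matrix and distortion coefficients; the file name and calibration values are hypothetical placeholders, not part of the posting.

import cv2
import numpy as np

# Hypothetical intrinsics from a prior calibration (fx, fy, cx, cy in pixels)
camera_matrix = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
# Hypothetical distortion coefficients (k1, k2, p1, p2, k3)
dist_coeffs = np.array([-0.12, 0.05, 0.001, 0.0005, 0.0])

frame = cv2.imread("sample_frame.png")  # placeholder input image
if frame is not None:
    undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
    cv2.imwrite("sample_frame_undistorted.png", undistorted)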

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

gujarat

On-site

As a Data Entry Operator, your primary responsibility will be to accurately and efficiently input data into computer systems, verify its accuracy, and maintain organized records while adhering to confidentiality and data security protocols. Some of the core tasks you will be expected to perform include:

- Accurate Data Entry: Inputting data from various sources such as documents and forms into databases or spreadsheets with meticulous attention to detail.
- Data Verification: Comparing entered data with source documents to ensure accuracy and identify any discrepancies or errors.
- Data Organization: Systematically organizing and filing data entries for easy access and retrieval to maintain a well-organized record of all inputted information.
- Data Security: Handling sensitive and confidential data with care, adhering to security and privacy policies and procedures to protect the integrity of the data.
- Data Processing: Timely processing of data entries to ensure that data is entered in an organized manner.
- Report Generation: Assisting in the generation of reports based on the entered data, often utilizing data analysis tools.
- Data Research and Collection: Conducting basic research and collecting data from various sources to support data entry tasks.
- Data Updates: Monitoring and updating existing data as required to ensure that records are current and accurate.
- Data Backups: Performing regular data backups to prevent data loss and maintain data integrity.

This is a full-time position with benefits such as food provided, internet reimbursement, and leave encashment. The work schedule is during the day shift with a yearly bonus offered. Proficiency in English is preferred for this role, and the work location is in person.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

Job Description: As a 3D Scanning Operator at Cubistry Teck Solutions, you will play a crucial role in conducting 3D scanning operations, portable CMM inspections, reverse engineering activities, and prototyping services. Your responsibilities will include setting up and operating 3D scanning equipment, ensuring data accuracy, processing scanned data, and collaborating with team members to deliver high-quality outputs. You will be based in Bengaluru, working full-time on-site to contribute to our commitment to providing innovative and precise solutions to our clients.

To excel in this role, you should have experience in 3D scanning and operating related equipment, possess knowledge of portable CMM inspection and reverse engineering techniques, demonstrate proficiency in data processing and analysis, and exhibit strong attention to detail and accuracy. Your ability to work collaboratively with a team, coupled with strong communication skills and the capacity to follow instructions precisely, will be essential in meeting the demands of the role. Any experience in prototyping and product development would be advantageous, and holding a technical certification or diploma in a relevant field is preferred.

Join us at Cubistry Teck Solutions, where you will be at the forefront of technology, ensuring accuracy and efficiency in all our services while contributing to our dedication to meeting the needs of our clients through innovative solutions.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer at our company, you will play a crucial role in designing, building, and maintaining scalable data pipelines and systems to facilitate analytics and data-driven decision-making. Your responsibilities will include utilizing your expertise in data processing, data modeling, and big data technologies to ensure the smooth functioning of our data infrastructure. You will collaborate closely with cross-functional teams to understand their data requirements and provide efficient solutions to meet their needs. Additionally, you will be expected to stay updated on emerging trends in data engineering and continuously improve our data processes for enhanced performance and efficiency. The ideal candidate for this role is someone who is passionate about leveraging data to drive business outcomes and possesses a strong background in data engineering practices. If you are a skilled Data Engineer looking to make a significant impact in a dynamic and innovative environment, we invite you to join our team.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

As a back-end developer, you will be responsible for designing, building, maintaining, and testing the back-end of applications or systems. This includes working on databases, APIs, and various processes that occur behind the scenes. Your main tasks will involve creating, testing, and debugging the back-end of applications, setting up scalable microservices, developing and managing APIs, as well as integrating AI-powered functionalities for data processing and predictive analytics. Additionally, you will utilize various tools, frameworks, and programming languages to develop user-friendly prototypes.

This is a full-time, permanent position with benefits such as health insurance and provident fund. The work location is in Pimpri-Chinchwad, Maharashtra, and it requires in-person attendance. The work schedule is during the day shift from Monday to Friday. The preferred education qualification is a Bachelor's degree. If you are interested in this opportunity, you can contact the employer at +91 8483920882.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

You should hold a Bachelor's or higher degree in Computer Science or a related discipline, or possess equivalent qualifications with a minimum of 4+ years of work experience. Additionally, you should have at least 1+ years of consulting or client service delivery experience specifically related to Azure Microsoft Fabric.

Your role will involve 1+ years of experience in developing data ingestion, data processing, and analytical pipelines for big data. This includes working with relational databases like SQL Server and data warehouse solutions such as Synapse/Azure Databricks. You must have hands-on experience in implementing data ingestion, ETL, and data processing using various Azure services such as ADLS, Azure Data Factory, Azure Functions, and services in Microsoft Fabric.

A minimum of 1+ years of hands-on experience in Azure and Big Data technologies is essential. This includes proficiency in Java, Python, SQL, ADLS/Blob, pyspark/SparkSQL, and Databricks. Moreover, you should have a minimum of 1+ years of experience in working with RDBMS, as well as familiarity with Big Data file formats and compression techniques. Your expertise should also extend to using developer tools like Azure DevOps, Visual Studio Team Server, Git, etc. This comprehensive skill set will enable you to excel in this role and contribute effectively to the team.
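For context on the kind of ingestion-and-processing pipeline described above, a minimal PySpark sketch is shown below; the storage paths, column names, and table name are hypothetical and would differ in a real ADLS/Databricks or Fabric environment.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Hypothetical raw CSV landing zone (e.g., a mounted ADLS path)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/orders/"))

# Basic cleansing: drop rows missing the key, parse the timestamp, de-duplicate
clean = (raw.dropna(subset=["order_id"])
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropDuplicates(["order_id"]))

# Persist as a Delta table for downstream analytics (Databricks/Fabric style)
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")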

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Are you passionate about leveraging the latest technologies for strategic change? Do you enjoy problem-solving in clever ways? Are you organized enough to drive change across complex data systems? If so, you could be the right person for this role.

As an experienced data engineer, you will join a global data analytics team in our Group Chief Technology Officer / Enterprise Architecture organization, supporting strategic initiatives that range from portfolio health to integration. You will help the Group Enterprise Architecture team develop our suite of EA tools and workbenches, work in the development team to support the development of portfolio health insights, build data applications from cloud infrastructure to the visualization layer, and produce clear and commented code as well as clear and comprehensive documentation. Additionally, you will play an active role with technology support teams, ensure deliverables are completed or escalated on time, and provide support on any related presentations, communications, and trainings. You will need to be a team player, working across the organization with skills to indirectly manage and influence, and be a self-starter willing to inform and educate others.

The mandatory skills required for this role include:
- A B.Sc./M.Sc. degree in computing or similar
- 5-8+ years of experience as a Data Engineer, ideally in a large corporate environment
- In-depth knowledge of SQL and data modeling/data processing
- Strong experience working with Microsoft Azure
- Experience with visualization tools like PowerBI (or Tableau, QlikView, or similar)
- Experience working with Git, JIRA, GitLab
- A strong flair for data analytics, IT architecture, and IT architecture metrics
- Excellent stakeholder interaction and communication skills
- An understanding of performance implications when making design decisions to deliver performant and maintainable software
- Excellent end-to-end SDLC process understanding
- A proven track record of delivering complex data apps on tight timelines
- Fluency in English, both written and spoken
- Passion for development with a focus on data and cloud
- An analytical and logical mindset with strong problem-solving skills
- A team player comfortable with taking the lead on complex tasks
- An excellent communicator adept at handling ambiguity and communicating with both technical and non-technical audiences
- Comfort working in cross-functional global teams to effect change
- Passion for learning and developing hard and soft professional skills

Nice-to-have skills include experience working in the financial industry, experience in complex metrics design and reporting, experience in using artificial intelligence for data analytics, and proficiency in English at C1 Advanced level.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

You will play a crucial role in defining and developing Enterprise Data Structure along with Data Warehouse, Master Data, Integration, and transaction processing while maintaining and strengthening the modeling standards and business information. Your responsibilities will include:

- Partnering with business leadership to provide strategic recommendations for maximizing the value of data assets and protecting the organization from disruptions.
- Assessing benefits and risks of data using business capability models to create a data-centric view aligned with the defined business strategy.
- Creating data strategies and roadmaps for Reference Data Architecture as per client requirements.
- Engaging stakeholders to implement data governance models and ensure compliance with data modeling standards.
- Overseeing frameworks to manage data across the organization and collaborating with vendors for system integrity.
- Developing data migration plans and ensuring end-to-end view of all data service provider platforms.
- Promoting common semantics and proper metadata use throughout the organization.
- Providing solutions for RFPs from clients and ensuring implementation assurance.
- Building enterprise technology environment for data architecture management by implementing standard patterns for data layers, data stores, and data management processes.
- Developing logical data models for analytics and operational structures in accordance with industry best practices.
- Enabling delivery teams by providing optimal delivery solutions and frameworks, monitoring system capabilities, and identifying technical risks.
- Ensuring quality assurance of architecture and design decisions, recommending tools for improved productivity, and supporting integration teams for better efficiency.
- Supporting the pre-sales team in presenting solution designs to clients and demonstrating thought leadership to act as a trusted advisor.

Join Wipro to be a part of an end-to-end digital transformation partner with bold ambitions and constant evolution. Realize your ambitions and design your reinvention in a purpose-driven environment. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

haryana

On-site

As the Data Operations Team Lead at Consilio, you play a crucial role in ensuring the timely and efficient delivery of data processing and hosting services to our clients. Your responsibilities include overseeing day-to-day work order handling, managing a team of 6-9 associates, and serving as an escalation point for technical issues. With your deep understanding of the eDiscovery domain and Consilio Data Operations processes, you guide seniors and consultants, provide timely feedback, and prioritize tasks for assigned projects.

You will collaborate closely with project managers on eDiscovery project schedules, interact with IT support teams to resolve technical issues, and report project status updates to Data Ops leadership. Your role also involves creating templates as required, delegating tasks to juniors, and ensuring queue movement and issue status updates are provided to worldwide counterparts. Additionally, you will be responsible for preparing high-impact incident reports, collaborating with IT and product management on new application upgrades, enhancing training modules, and building new sub-teams on new processes.

To qualify for this role, you should ideally possess a four-year college degree, preferably in a technical discipline, along with an MBA qualification. You should have 6-9 years of related E-Discovery or strong database/scripting experience, as well as experience in working under tight deadlines and unstructured situations. Strong leadership and team management skills are essential, along with excellent verbal and written communication abilities, strong analytical and problem-solving skills, and proficiency in basic PC functions and the Windows environment. Moreover, you should have expertise in E-discovery standard products, ESI data processing platforms (such as Nuix, Venio, Reveal, etc.), ESI data hosting platforms (such as Concordance, Relativity, Introspect, etc.), and leading text editing tools. You will be expected to participate in regular policy and procedure review meetings, conduct knowledge sharing sessions, and contribute to the professional growth and development of the team.

Join us at Consilio to be part of a collaborative and innovative work environment where you can contribute to shaping the future of our software development processes. We offer competitive salary, health, dental, and vision insurance, a retirement savings plan, and professional development opportunities. Embrace our True North Values of Excellence, Passion, Collaboration, Agility, People, and Vision as we strive to make every client our advocate and win together through teamwork and communication.

Posted 2 weeks ago

Apply

6.0 - 15.0 years

0 Lacs

pune, maharashtra

On-site

The role of Cloud Architecture and Engineering at Deutsche Bank as a Director based in Pune, India involves leading the global cloud infrastructure architecture for Corporate Bank Technology. With a focus on domains like Cash Management, Securities Services, Trade Finance, and Trust & Agency Services, you will be responsible for creating domain-level architecture roadmaps, designing solutions for the TAS business, and providing technical leadership to development teams across multiple TAS Tribes and Corporate Bank Domains.

Key Responsibilities:
- Leading the global cloud infrastructure architecture across Corporate Bank domains
- Creating domain-level architecture roadmaps to ensure long-term business strategy success
- Building solutions and delivering to production for the TAS business
- Designing blueprints for GenAI and cloud infrastructure and migration projects
- Providing technical leadership to development teams and collaborating with various stakeholders
- Improving team performance in SDLC, QA, CICD, and post-release activities
- Serving as an expert on the platform in strategy meetings and product development opportunities
- Coordinating with Lead Software Development Engineers for application release plans and architectural improvements

Skills Required:
- Expertise in GenAI and cloud architecture with hands-on coding skills
- 15+ years of overall experience with 6+ years in building cloud applications
- Strong communication skills and experience in broadcasting complex architecture strategies
- Certification in GCP or AWS architecture
- Deep knowledge of Architecture and Design Principles, Algorithms, Data Structures, and UI Accessibility
- Familiarity with Microservices Architecture, Kubernetes, Docker, Cloud Native applications, monitoring tools, and APIs
- Proficiency in GIT, Jenkins, CICD, DevOps, SRE techniques, and Information Security principles
- Strong knowledge of JavaScript, React, Node, Typescript, HTML, CSS, Core Java, Spring-boot, Oracle, MySQL, Kafka, and MQ
- Experience with cloud components like Big Query, Dataflow, Dataproc, DLP, Big Table, Pub/Sub, and more
- Skills in building highly available distributed applications with a zero-downtime release cycle and modern security ecosystems

The role offers benefits such as a competitive leave policy, parental leaves, childcare assistance, flexible working arrangements, training on International Reporting Standards, sponsorship for certifications, an employee assistance program, insurance coverage, and more. The supportive environment at Deutsche Bank encourages continuous learning, career progression, and a culture of collaboration and empowerment.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As a Solution Architect specializing in Cloud and Data Engineering with 8-12 years of experience, you will be responsible for providing architecture and design expertise as well as consulting services. Your focus will primarily be on enterprise solutions, data analytics platforms, lakehouses, data engineering, data processing, data warehousing, ETL, Hadoop, and Big Data. Your role will involve defining and designing data governance, data management, and data security solutions across various business verticals within an enterprise. You should have experience working with at least one of the major cloud platforms such as AWS, Azure, or GCP. Your knowledge should extend to market insights, technology trends, and competitor intelligence. Experience in managing proposals (RFP/RFI/RFQ) will be advantageous in this position.

Your key responsibilities will include developing comprehensive solutions encompassing architecture, high-level design, statements of work, service design, and bills of materials. You will be required to showcase Impetus products and services capabilities to potential clients and partners, serving as their trusted technical solution advisor. Collaborating with sales, customers, and Impetus implementation teams, you will develop use case demonstrations and proofs of concept. In addition, you will lead technical responses to RFI/RFP and contribute to thought leadership initiatives through participation in conferences, summits, events, and community involvement. Furthermore, you will play a crucial role in providing knowledge transfer to delivery teams to ensure a seamless transition from presales to delivery.

Overall, as a Solution Architect in this role, you will be at the forefront of driving innovative solutions, engaging with clients, and contributing to the growth and success of the organization.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

surat, gujarat

On-site

We are looking for an experienced AI/ML cum Python Developer with 2+ years of hands-on work in machine learning, Python development, and API integration. The ideal candidate should also have experience building AI agents/smart systems that can plan tasks, make decisions, and work independently using tools like LangChain, AutoGPT, or similar frameworks. You'll be part of a collaborative team, working on real-world AI projects and helping us build intelligent, scalable solutions.

Key Responsibilities:
- Develop, train, and deploy machine learning models using frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Develop AI agents capable of decision-making and multi-step task execution.
- Write efficient and maintainable Python code for data processing, automation, and backend services.
- Design and implement REST APIs or backend services for model integration.
- Handle preprocessing, cleaning, and transformation of large datasets.
- Evaluate model accuracy and performance, and make necessary optimizations.
- Collaborate with cross-functional teams including UI/UX, QA, and product managers.
- Stay updated with the latest trends and advancements in AI/ML.

Key Performance Areas (KPAs):
- Development of AI/ML algorithms and backend services.
- AI agent development and performance.
- Model evaluation, testing, and optimization.
- Seamless deployment and integration of models in production.
- Technical documentation and project support.
- Research and implementation of emerging AI technologies.

Key Performance Indicators (KPIs):
- Accuracy and efficiency of AI models delivered.
- Clean, reusable, and well-documented Python code.
- Timely delivery of assigned tasks and milestones.
- Issue resolution and minimal bugs in production.
- Contribution to innovation and internal R&D efforts.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- Minimum 2 years of experience in Python and machine learning.
- Hands-on with AI agent tools like LangChain, AutoGPT, OpenAI APIs, Pinecone, etc.
- Strong foundation in algorithms, data structures, and mathematics.
- Experience with Flask, FastAPI, or Django for API development.
- Good understanding of model evaluation and optimization techniques.
- Familiarity with version control tools like Git.
- Strong communication and team collaboration skills.

Interview Process:
- HR Round
- Technical Round
- Practical Round
- Salary Negotiation
- Offer Release
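As a rough illustration of the model-training and REST-integration work described above, here is a minimal hedged Python sketch that trains a scikit-learn model and exposes it through a FastAPI endpoint; the dataset, model choice, file name, and endpoint path are hypothetical stand-ins, not part of the posting.

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train and persist a small example model (stand-in for a real training pipeline)
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
joblib.dump(model, "model.joblib")

app = FastAPI()

class Features(BaseModel):
    values: list[float]  # four iris measurements in this toy example

@app.post("/predict")
def predict(features: Features):
    clf = joblib.load("model.joblib")
    pred = clf.predict(np.array(features.values).reshape(1, -1))
    return {"prediction": int(pred[0])}

# Run with: uvicorn app:app --reload   (assuming this file is saved as app.py)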

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

panipat, haryana

On-site

As a System Monitoring & Maintenance professional, you will be responsible for monitoring computer systems, networks, and peripheral devices to identify errors or performance issues. Your role will involve conducting routine maintenance tasks, performing software updates, and ensuring that data backup and recovery processes are carried out as scheduled. In the event of system malfunctions or operational issues, you will be expected to promptly identify and resolve these issues. Additionally, you will collaborate with IT support teams to address hardware and software problems, respond to user queries, and provide technical assistance as needed.

Data processing and entry are integral parts of this role, requiring you to accurately enter, update, and verify data in computer systems. You will also be responsible for generating reports and maintaining precise records of system operations. Maintaining security and compliance with company policies and IT security protocols is crucial. You will be tasked with monitoring systems for unauthorized access and potential security threats to ensure the protection of sensitive data. Your responsibilities will also include running batch processing and scheduled jobs, as well as operating printers, scanners, and other peripheral devices.

This is a full-time, permanent position with day shift, fixed shift, and morning shift options available. Proficiency in English is preferred for this role, and the work location is in person.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

chandigarh

On-site

As a Data Engineer, you will provide support to the Global BI team for Isolation Valves in their migration to Microsoft Fabric. Your primary focus will be on data gathering, modeling, integration, and database design to facilitate efficient data management. Your responsibilities will include developing and optimizing scalable data models to meet analytics and reporting requirements and utilizing Microsoft Fabric and Azure technologies for high-performance data processing.

In this role, you will collaborate with cross-functional teams, including data analysts, data scientists, and business collaborators, to understand their data needs and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives. Expertise in data modeling, with a specific emphasis on data warehouse and lakehouse design, will be essential. You will be responsible for designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Additionally, you will develop ETL processes using tools such as SQL Server Integration Services (SSIS) and Azure Synapse Pipelines to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also be part of your role. Your tasks will involve supervising and optimizing data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads.

Proficiency in Business Intelligence (BI) tools like Power BI and Tableau, along with experience in data integration and ETL tools such as Azure Data Factory, will be beneficial. You are expected to have expertise in Microsoft Fabric or similar data platforms and a deep understanding of the Azure Cloud Platform, particularly in data warehousing and storage solutions. Strong communication skills are essential, as you will need to convey technical concepts to both technical and non-technical stakeholders. The ability to work independently as well as within a team environment is crucial.

Preferred qualifications for this role include 3-5 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities, and proficiency in database management, SQL query optimization, and data mapping. A willingness to work flexible hours based on project requirements, strong documentation skills, and advanced SQL skills are also required. Hands-on experience with Medallion Architecture for data processing, prior experience in a manufacturing environment, and the ability to quickly learn new technologies are advantageous. Travel up to 20% may be required. A Bachelor's degree or equivalent experience in Science, with a focus on MIS, Computer Science, Engineering, or a related field, is preferred. Good interpersonal skills in English for efficient collaboration with overseas teams and Agile certification are also desirable.

At Emerson, we value an inclusive workplace where every employee is empowered to grow and contribute. Our commitment to ongoing career development and fostering an innovative and collaborative environment ensures that you have the support to succeed. We provide competitive benefits plans, medical insurance options, employee assistance programs, recognition, and flexible time off plans to prioritize employee wellbeing. Emerson is a global leader in automation technology and software, serving industries such as life sciences, energy, power, renewables, and advanced factory automation. We are committed to diversity, equity, and inclusion, and offer opportunities for career growth and development. Join our team at Emerson and be part of a community dedicated to making a positive impact through innovation and collaboration.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Description: Seeking a skilled and detail-oriented OAS/OBIEE Consultant to join our data and analytics team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence (BI) and dashboarding solutions to support smelter operations and decision-making processes. You will work closely with cross-functional teams to transform raw data into actionable insights using modern BI tools and ETL processes.

Key Responsibilities:
- Develop and maintain interactive dashboards and reports using Microsoft Power BI and Oracle Analytics.
- Design and implement ETL processes using Oracle Data Integrator and other tools to ensure efficient data integration and transformation.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data analysis and validation to ensure data accuracy and consistency across systems.
- Optimize queries and data models for performance and scalability.
- Maintain and support Oracle Database and other RDBMS platforms used in analytics workflows.
- Ensure data governance, quality, and security standards are met.
- Provide technical documentation and user training as needed.

Required Skills and Qualifications:
- Proven experience in BI solutions, data analysis, and dashboard development.
- Strong hands-on experience with Microsoft Power BI, Oracle Analytics, and Oracle Data Integrator.
- Proficiency in Oracle Database, SQL, and relational database concepts.
- Solid understanding of ETL processes, data management, and data processing.
- Familiarity with business intelligence and business analytics best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience in the smelting or manufacturing industry is a plus.
- Knowledge of scripting languages (e.g., Python, Shell) for automation.
- Certification in Power BI, Oracle Analytics, or related technologies.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

Join the team at Anteriad and help revolutionize the way B2B marketers utilize data to make informed business decisions. Anteriad is not just a typical B2B solution provider; we are problem solvers who believe that data is the cornerstone of effective solutions across various marketing challenges. Our dedicated team is focused on developing impactful solutions that yield tangible results for our clients, whether through cutting-edge technology or in-depth analysis. As a member of our Quality Control team, you will collaborate with a dynamic group to deliver top-tier solutions to Fortune 500 companies. This role is ideal for individuals who are intelligent, proactive, and driven. Your responsibilities will include ensuring the quality of deliverables to customers and striving to minimize rejection rates through a comprehensive approach to third-party research as per client requirements.

At Anteriad, we embody a culture of constant progress, fostering an environment that empowers our employees to excel. We offer continuous training and development opportunities through the Cornerstone Learning System, ensuring a steady, full-time role that combines collaborative teamwork with independent tasks. Our commitment to community outreach, professional mentoring, and employee resource groups allows our team members to engage with like-minded individuals and grow both personally and professionally.

As part of the Quality Control team, your duties will involve verifying and validating lead data, conducting thorough online research to verify data accuracy, and providing regular updates on rejection rates to internal stakeholders. You will be accountable for ensuring the quality of final data deliveries, including eliminating non-qualified lists based on targeting criteria. Collaboration with key departments such as Planning, Process Management, Fulfilment, and Support Managers will be essential to enhance the Anteriad customer experience.

To excel in this role, you must be highly organized, self-motivated, and adept at managing multiple projects of varying complexities while meeting deadlines. Proficiency in market/industry research analysis, Microsoft Excel, data processing, data appending, web browsing, and Google search is essential. Strong communication skills, both written and verbal, along with excellent analytical abilities are key attributes we are looking for in potential candidates.

At Anteriad, we uphold the following values:
- Lead & Learn: We strive to lead with vision, innovation, and continuous learning to stay ahead in our industry.
- Collaborate & Celebrate: Achieving greatness is a collective effort, and we celebrate accomplishments, big and small, as one Anteriad team.
- Innovate & Inspire: Our goal is to exceed customer expectations by embracing bold new ideas and inspiring each other towards greater success.
- Do More & Do Good: We go the extra mile to serve our clients, colleagues, and communities, embodying a spirit of dedication and goodwill in all that we do.

If you are ready to be part of a forward-thinking team that values innovation, collaboration, and excellence, Anteriad welcomes your expertise and enthusiasm to contribute to our mission of driving impactful solutions through data-driven strategies.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

hyderabad

Work from Office

The Lead, Data Strategy & Market Intelligence will drive innovation in how we interpret and present global trade intelligence. This role goes beyond analytics delivery: it requires proactively discovering new opportunities, integrating fresh datasets, and working hand-in-hand with product and web teams to deliver impactful, client-facing solutions. The position reports directly to the CEO.

Key Responsibilities:
- Innovative Data Interpretation: Continuously brainstorm and propose new methods of data analysis, visualization, and storytelling that add value to our platforms.
- Dataset Expansion: Research, evaluate, and integrate new public and proprietary data sources to strengthen our intelligence products.
- Data Processing & ETL: Lead the ingestion, cleaning, transformation, and structuring of global trade, company, and financial datasets.
- Product Development: Co-create features like Ex-Im Edge insights and RRR Ratings in collaboration with the Web team, ensuring analytics are embedded into the user experience.
- AI Model Support: Prepare and maintain high-quality training datasets for AI-based company name mapping and classification.
- Sales Enablement: Build and maintain live dashboards that prioritize and score leads for our 600+ agent sales operation.
- Collaboration: Work closely with the Web and Product teams to ensure seamless integration of analytics into platform features.
- Direct Reporting: Provide strategic input and progress updates directly to the CEO.
- Process Governance: Document and implement SOPs for data governance, accuracy checks, and reporting workflows.

Performance Expectations (First 90 Days):
- Deliver a pilot Ex-Im Edge dashboard and RRR Ratings prototype.
- Propose and integrate at least two new external datasets into our intelligence stack.
- Create a gold-standard dataset for AI model training.
- Build and deploy a live lead-scoring dashboard for sales.
- Present at least three innovative data interpretation ideas to management for potential product features.

Required Skills and Qualifications:
- Solid experience in data management, commercial intelligence, or demand generation in B2B companies.
- Experience in team leadership and management of cross-functional projects.
- Analytical skills to turn data into actionable insights and campaign strategies.
- Excellent verbal and written communication skills, with strong stakeholder management abilities.
- Demonstrates all core management responsibilities: envision the future, build strong teams, develop people, and manage performance.
- Strong organizational skills, an ownership mindset, and focus on operational efficiency.
- Previous experience in tech, SaaS, or B2B consulting companies.
- Knowledge of data tools such as ZoomInfo, Apollo, Lusha, and Clearbit, among others.
- Familiarity with marketing automation tools (e.g., Marketo, Eloqua).
- Understanding of ABM (Account-Based Marketing) and data-driven strategies.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

hyderabad

Work from Office

Job Summary: We are looking for a senior DevOps engineer to help build, automate, and validate end-to-end machine learning pipelines at enterprise scale. This role combines responsibilities across DevOps engineering and Quality Engineering (QE), enabling smooth CI/CD workflows while ensuring robust test automation, release quality, and operational excellence. You'll work closely with data scientists, ML engineers, platform teams, and product managers to operationalize AI/ML workflows, enabling smooth transitions from experimentation to production. You'll also define test strategies, automate validation pipelines, and champion the overall quality of Teradata's AI/ML platform and analytic products.

What You'll Do:
- Collaborate with AI/ML teams to automate and maintain test frameworks for Teradata features, SQL-based components, and analytics functions.
- Define and implement end-to-end test strategies for analytic products.
- Own quality gates in CI pipelines to block releases with critical bugs.
- Collaborate with the Agentic AI team to validate models used by intelligent agents (e.g., LLM-based systems).

Who You'll Work With:
- AI/ML engineers and data scientists building enterprise-grade models and intelligent agents.
- Product managers and quality leaders defining success criteria and customer expectations for model-based features.
- Release management teams ensuring delivery standards and model lifecycle hygiene.

What We're Looking For - Minimum Requirements:
- 5+ years of industry experience in a QA, DevOps, or software engineering role.
- Solid coding skills in Python, including test frameworks (e.g., PyTest, unittest) and data libraries (pandas, NumPy).
- Strong experience in SQL and databases.
- Experience in building and running test automation in a CI/CD pipeline.
- Experience in AWS, Azure, or Google Cloud.
- Familiarity with containerization (Docker) and cloud platforms (AWS, GCP, or Azure).
- Working knowledge of Linux-based systems and networking fundamentals.

Preferred Qualifications:
- Bachelor's or Master's in Computer Science, Artificial Intelligence, or a related field (or equivalent experience).
- Familiarity with Teradata Vantage, model scoring, or cloud-native deployments (AWS/GCP/Azure).
- Prior experience testing or validating ML or analytics-based applications.
- Experience in testing data pipelines, ETL flows, or large-scale data processing systems.
- Strong communication and documentation skills; ability to write test plans and share findings with both technical and non-technical audiences.
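To illustrate the kind of Python test automation this posting describes (PyTest plus pandas checks that could run as a CI quality gate), here is a minimal hedged sketch; the transformation function and column names are hypothetical stand-ins, not Teradata APIs.

import pandas as pd
import pytest

def normalize_amounts(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical transformation under test: drop null amounts and cast to float."""
    out = df.dropna(subset=["amount"]).copy()
    out["amount"] = out["amount"].astype(float)
    return out

def test_normalize_amounts_drops_nulls_and_casts():
    raw = pd.DataFrame({"order_id": [1, 2, 3], "amount": ["10.5", None, "7"]})
    result = normalize_amounts(raw)
    # Null rows are removed and the dtype becomes numeric
    assert len(result) == 2
    assert result["amount"].dtype == float

@pytest.mark.parametrize("bad_value", [None, float("nan")])
def test_normalize_amounts_handles_missing(bad_value):
    raw = pd.DataFrame({"order_id": [1], "amount": [bad_value]})
    assert normalize_amounts(raw).empty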

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

navi mumbai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain robust data pipelines to support data processing and analytics.
- Collaborate with data scientists and analysts to understand data needs and provide appropriate solutions.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and processes for data migration and transformation.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
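As a small illustration of the pipeline and data-quality work outlined above, the hedged PySpark sketch below checks for null keys and duplicates before promoting data; the paths and column names are hypothetical and not specific to any one platform.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality-checks").getOrCreate()

# Hypothetical staged dataset produced by an upstream ingestion job
df = spark.read.parquet("/mnt/staging/customers/")

# Simple quality checks: null business keys and duplicate rows
null_keys = df.filter(F.col("customer_id").isNull()).count()
duplicates = df.count() - df.dropDuplicates(["customer_id"]).count()

if null_keys == 0 and duplicates == 0:
    # Only promote clean data to the curated zone
    df.write.mode("overwrite").parquet("/mnt/curated/customers/")
else:
    print(f"Blocked load: {null_keys} null keys, {duplicates} duplicate customer_id rows")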

Posted 2 weeks ago

Apply

0.0 - 6.0 years

2 - 8 Lacs

bengaluru

Work from Office

About Our Team: At Zysk Tech, a passionate and innovative web development startup based in Bangalore, India, we focus on crafting exceptional web and mobile experiences. Our young, diverse team consists largely of first-job professionals who thrive in a creative, collaborative, and continuously learning environment. If you are excited by technology and eager to make an impact, Zysk is the right place to grow your career.

Skills Needed:
- Proficient use of ChatGPT and AI tools for data entry and automation
- Strong internet research capabilities to find and verify information accurately
- Good written and verbal communication skills
- High accuracy and attention to detail in data entry tasks

Key Responsibilities:
- Perform accurate and efficient data entry using ChatGPT and AI tools to automate routine tasks
- Conduct online research to gather and verify relevant data
- Ensure data integrity and correct any errors during data processing
- Collaborate with team members to improve data entry processes using AI-based solutions
- Maintain confidentiality and security of sensitive data

Employment Details:
- Employment Type: 6 months
- Salary: 2.4 LPA
- Work Location: On-site

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

chennai

Work from Office

The Module Configuration and Programming (MCP) Diagnostic Designer Engineer is primarily responsible for designing, developing, validating, and implementing programming applications, and for delivering software files and configuration data to the dealer/aftermarket network to support vehicles. The MCP Diagnostic Design Engineer will use the OTX/Grade X authoring software and other XML scripting tools to develop the programming applications.

Experience and Skills Required:
- BS in Computer Science, Computer Engineering, Electrical Engineering, or Automotive Technology preferred.
- Dealership field experience working with Service Departments and Repair Technicians is a plus.
- Strong desire to enhance and improve the technician's diagnostic experience.
- Project development experience with Ford Agile/Rally/JIRA is a plus.
- Strong Program Management skills.
- Strong verbal and written communication skills.
- Ability to work closely with global teams to communicate and agree on desired outcomes.
- Ability to communicate with tool owners to drive changes that relate to both HMI and UX.

Main responsibilities include:
- Develop diagnostic solutions for Over the Air (OTA) supported connected vehicles.
- Interface with Product Engineering regarding new control system technologies requiring new diagnostic service procedures and support.
- Use Rally to document Features and User Stories to add new content to FDRS.
- Design, develop, test, and release all features of MCP that relate to software compatibility (for example, operating system, vehicle protocol / communication drivers, databases, libraries) on the two diagnostic service tools, using OTX/Grade X programming, XML scripting tools, and software programming tools.
- Manage and understand data, process data requests and data concerns, and maintain templates.
- Interface with Software/Hardware Designers/Suppliers to assure design intent is met.
- Design and implement new tools for dealership diagnostic support.
- Perform validation with both simulation tools and on-vehicle validation.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

5 - 7 Lacs

kolkata, mumbai, new delhi

Work from Office

Identify, analyze, and interpret trends and patterns in complex data sets. Use data to drive business insights and generate regular, insightful reports. Develop and implement databases, data pipelines, and analytics systems to improve efficiency and data quality. Extract data from primary and secondary sources, and maintain databases/data systems. Clean, filter, and validate data to ensure consistency and accuracy. Collaborate with cross-functional teams and management to define data needs and prioritize information requirements. Support process optimization initiatives using data-driven strategies. Apply AI (basic to intermediate) to automate data processing and enhance analysis.

Requirements:
- 2-5 years of proven experience as a Data Analyst or in Business Data Analysis.
- Strong command over Excel, Power BI, Tableau, and other data visualization tools.
- Proficiency in SQL, data querying, and working with large relational databases.
- Exposure to AI/ML tools or platforms (e.g., Python for data science, AutoML, ChatGPT API integration, or similar) will be an added advantage.
- Strong analytical and problem-solving skills with high attention to detail and data accuracy.
- Excellent communication and presentation skills with the ability to explain complex insights to non-technical stakeholders.
- Bachelor's degree in Computer Science, Data Science, Statistics, Engineering, or a related field.
- Ability to work flexibly, multitask, and exercise sound judgment in a fast-paced environment.
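For a concrete feel of the trend-analysis and data-cleaning work listed above, here is a minimal hedged pandas sketch; the file name and columns are hypothetical placeholders.

import pandas as pd

# Hypothetical sales extract from a primary source system
sales = pd.read_csv("monthly_sales.csv", parse_dates=["order_date"])

# Clean and validate: drop rows with missing amounts, keep positive values only
sales = sales.dropna(subset=["amount"])
sales = sales[sales["amount"] > 0]

# Interpret trends: month-over-month revenue by region
trend = (sales
         .assign(month=sales["order_date"].dt.to_period("M"))
         .groupby(["month", "region"], as_index=False)["amount"]
         .sum()
         .sort_values(["region", "month"]))
trend["mom_change_pct"] = trend.groupby("region")["amount"].pct_change() * 100

print(trend.head())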

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
