Gross salary $2500 - 3000 Full time
Lead Data Scientist – GenAI & Applied ML
  • Niuro
Python SQL Django DevOps
Niuro is a company dedicated to connecting elite technology teams with leading companies in the U.S. We specialize in providing autonomous, high-performance tech teams focused on delivering innovative solutions globally. Our projects typically involve industrial data initiatives that demand technical excellence and innovation. The role will contribute to advanced machine learning and data science projects that aim to drive business value and technological advancement for top-tier U.S. clients. Our teams benefit from a collaborative environment with continuous professional development and administrative support, enabling a focus on impactful and challenging work.

This job is published by getonbrd.com.

Key Responsibilities

As a Data Scientist & Machine Learning Engineer at Niuro, you will play a crucial role in developing, deploying, and maintaining machine learning models and data-driven solutions. Your responsibilities include:
  • Designing and implementing machine learning pipelines and models using Python with a focus on production readiness.
  • Collaborating with cross-functional teams to understand data requirements and deliver scalable solutions that solve complex problems.
  • Integrating ML models into production environments with the help of MLOps practices and DevOps methodologies.
  • Working with large language models (LLMs) and Retrieval-Augmented Generation (RAG) techniques to enhance conversational AI and data accessibility.
  • Monitoring model performance, retraining and optimizing as needed to maintain high accuracy and robustness.
  • Documenting workflows and solutions clearly for team communication and knowledge sharing.
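The Retrieval-Augmented Generation technique mentioned above couples a document retriever with an LLM. A minimal sketch of the retrieval step, using TF-IDF cosine similarity as a stand-in for learned embeddings (the corpus and queries are illustrative, not from any real project):

```python
import math
from collections import Counter

def tokenize(text):
    """Lowercase and split, dropping periods (toy tokenizer)."""
    return text.lower().replace(".", "").split()

def tfidf_vectors(docs):
    """Compute a TF-IDF weight dict for each tokenized document."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=1):
    """Return the k passages most similar to the query; the query is added
    to the collection only so document frequencies include its terms."""
    docs = [tokenize(p) for p in corpus] + [tokenize(query)]
    vecs = tfidf_vectors(docs)
    q = vecs[-1]
    ranked = sorted(zip(corpus, vecs[:-1]),
                    key=lambda pv: cosine(q, pv[1]), reverse=True)
    return [p for p, _ in ranked[:k]]

corpus = [
    "Invoices are processed nightly by the batch pipeline.",
    "Model retraining runs weekly on fresh labeled data.",
    "Access requests are handled by the security team.",
]
print(retrieve("when does model retraining happen", corpus))
```

In a production RAG system the retrieved passages would then be injected into the LLM prompt; here only the ranking step is shown.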

Required Qualifications and Skills

We are seeking an expert Python developer with proven hands-on experience and certification to lead our machine learning projects. The ideal candidate must also possess strong expertise in ML and Data Science, demonstrating a solid understanding of algorithms, data processing, and statistical modeling.
Additional required skills include familiarity with MLOps and DevOps practices to streamline development and deployment cycles effectively. Experience with large language models (LLMs) and Retrieval-Augmented Generation (RAG) is important to support our conversational AI initiatives.
Effective communication skills in English at a conversational level are necessary to collaborate efficiently within our distributed, international teams.
Technical requirements: Expert-level Python programming skills (certified preferred); solid experience with machine learning frameworks and libraries; familiarity with software development best practices in an ML context.

Preferred Skills & Additional Experience

While not mandatory, the following skills would be valuable:
  • Experience with Python web frameworks such as Django and FastAPI.
  • Working knowledge of SQL databases for data querying and manipulation.
  • Practical experience with Docker containerization to support reproducible environments.
  • Familiarity with version control using Git, and Python data validation tools like Pydantic.
  • Understanding of the Gunicorn application server for deploying Python web applications.
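As an illustration of the SQL querying mentioned above, a minimal sketch using Python's built-in sqlite3 module; the experiment-tracking table and its rows are hypothetical:

```python
import sqlite3

# Hypothetical experiment-tracking table; in practice this would live in
# a real database server rather than an in-memory SQLite instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (model TEXT, accuracy REAL)")
conn.executemany(
    "INSERT INTO runs VALUES (?, ?)",
    [("baseline", 0.81), ("tuned", 0.88), ("tuned", 0.90)],
)

# Best accuracy per model, best model first.
best = conn.execute(
    "SELECT model, MAX(accuracy) FROM runs GROUP BY model ORDER BY 2 DESC"
).fetchone()
print(best)  # ('tuned', 0.9)
```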

Benefits and Work Environment

At Niuro, you will have the opportunity to work 100% remotely from LATAM, providing full flexibility and work-life balance. We offer competitive compensation aligned with experience and skills.
We invest in continuous training and leadership development to help you grow professionally while working on impactful and technically rigorous data projects.
We foster a culture of collaboration, innovation, and technical excellence, supported by a robust administrative infrastructure so you can focus purely on your technical challenges.
Upon successful completion of the initial contract, there is a strong opportunity for long-term, stable full-time employment with Niuro.

Informal dress code No dress code is enforced.
Gross salary $3000 - 4000 Full time
Data Engineer / Machine Learning Engineer
  • Lilo AI
Python PostgreSQL SQL NoSQL
Lilo AI is an innovative startup dedicated to transforming procurement for Commercial Real Estate (CRE) businesses by creating the most hassle-free procurement platform globally. Our platform leverages artificial intelligence to automate and optimize various procurement workflows, including invoicing, vendor management, and price comparisons. Serving diverse sectors such as hotels, gyms, schools, and senior living homes, our solutions save clients valuable time and money while improving operational efficiency. Our major clients include prestigious brands such as Fairfield, Hampton Inn, and Hilton. By joining Lilo AI, you will contribute to revolutionizing procurement processes at scale through cutting-edge AI technologies.


About the Role

As a Data Engineer / Machine Learning Engineer at Lilo AI, you will play a pivotal role in advancing and deploying machine learning solutions that enhance our procurement platform. Your primary responsibilities will include:
  • Designing, developing, and implementing machine learning models to optimize procurement workflows, such as price prediction algorithms, anomaly detection systems, and recommendation engines.
  • Building and maintaining robust data pipelines to efficiently preprocess and cleanse both structured and unstructured datasets.
  • Collaborating closely with engineers, product managers, and business stakeholders to integrate AI-driven insights seamlessly into our platform environment.
  • Optimizing model performance and reliability, including contributing to monitoring strategies and retraining pipelines to sustain production quality.
  • Keeping abreast of the latest developments in AI, machine learning, and data science to continually enhance our technology stack.
  • Supporting the deployment of machine learning models into production environments and improving MLOps workflows to increase operational efficiency.
This role requires a proactive mindset, a passion for AI applications in real-world business contexts, and the ability to thrive in a dynamic, fast-paced, and collaborative global team.
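One of the anomaly-detection systems described above can be prototyped as a simple z-score test over historical prices. A minimal standard-library sketch; the price history, SKU framing, and threshold are illustrative, and a production system would use more robust methods:

```python
from statistics import mean, stdev

def price_anomalies(prices, threshold=2.0):
    """Flag prices whose z-score exceeds the threshold."""
    mu, sigma = mean(prices), stdev(prices)
    if not sigma:
        return []
    return [p for p in prices if abs(p - mu) / sigma > threshold]

# Hypothetical unit-price history for a single SKU.
history = [9.99, 10.25, 10.10, 9.85, 10.05, 10.15, 48.00, 10.00]
print(price_anomalies(history))  # [48.0]
```

Note that a single large outlier inflates the standard deviation and masks itself at the textbook threshold of 3, which is why robust statistics (median/MAD) are usually preferred in practice.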

What You Will Need

To succeed in this role, candidates should demonstrate strong technical proficiency and relevant experience as detailed below:
  • A minimum of 2 years of professional experience in machine learning, data science, or closely related fields.
  • Proficiency in Python programming and familiarity with prominent ML frameworks and libraries such as Scikit-Learn, TensorFlow, or PyTorch.
  • Hands-on experience with both SQL and NoSQL databases, including but not limited to MongoDB and PostgreSQL.
  • Solid understanding of data preprocessing techniques, feature engineering, and model evaluation methodologies essential for robust ML model development.
  • Basic knowledge or experience with containerization (Docker), cloud computing platforms (AWS, Google Cloud Platform, or Azure), and MLOps tools is highly desirable.
  • Analytical mindset with strong problem-solving skills, able to handle and extract insights from large datasets effectively.
  • A continuous learner attitude, eager to develop technical and professional skills while contributing to team goals.
We value curiosity, collaboration, and adaptability, wanting individuals who are ready to grow alongside our rapidly expanding company.

Desirable Skills and Experience

While not mandatory, the following skills and experiences will give candidates an edge:
  • Experience working with time-series data, demand forecasting, or procurement-related AI models.
  • Familiarity with advanced ML techniques such as deep learning, reinforcement learning, or natural language processing.
  • Hands-on exposure to MLOps pipelines, automated model deployment, and monitoring platforms.
  • Knowledge of data engineering tools and frameworks like Apache Airflow, Spark, or Kafka.
  • Prior experience in the hospitality or Commercial Real Estate sectors.
  • Strong communication skills to effectively articulate technical concepts to non-technical stakeholders.
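For the demand-forecasting experience mentioned above, the simplest baseline is a moving-average forecast. A minimal sketch; the demand series is illustrative:

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("series shorter than window")
    return sum(series[-window:]) / window

# Hypothetical weekly order counts for one product.
demand = [120, 130, 125, 140, 135, 150]
print(moving_average_forecast(demand))  # (140 + 135 + 150) / 3
```

Baselines like this are typically the benchmark that more sophisticated time-series models must beat.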

Why Lilo AI?

Joining Lilo AI offers a unique opportunity to make a significant impact at a fast-growing US-based startup while enjoying the flexibility of remote work from Latin America. We provide:
  • A high-impact role within a pioneering team revolutionizing procurement with AI.
  • Possibility to grow professionally with opportunities to increase responsibilities over time.
  • Stock options available for the right candidate, sharing in our long-term success.
  • A collaborative, global, and multi-cultural work environment fostering innovation and continuous learning.

Pet-friendly Pets are welcome at the premises.
Flexible hours Flexible schedule and freedom for attending family needs or personal errands.
Health coverage Lilo AI pays or copays health insurance for employees.
Company retreats Team-building activities outside the premises.
Dental insurance Lilo AI pays or copays dental insurance for employees.
Computer provided Lilo AI provides a computer for your work.
Informal dress code No dress code is enforced.
Vacation over legal Lilo AI gives you paid vacations over the legal minimum.
$$$ Full time
Machine Learning Engineer
  • Haystack News
  • Lima (Hybrid)
Python SQL NoSQL Big Data

Haystack News is the leader in transitioning younger audiences to a next-generation TV news product. We empower users to watch the news they care about with a personalized headline news channel. Our personalization and big data technology enable a seamless lean-back experience reaching millions of users. In 2020, we streamed billions of minutes of news to users around the globe.

We're available on a variety of TV platforms (Roku, Fire TV, Android TV, Apple TV, Chromecast, Vizio) as well as mobile, tablet, and desktop.


Job functions

We are looking for an outstanding machine learning engineer to join Haystack and help improve the experience of millions of users that rely on Haystack to get the news every day.

Within the team you’ll find many opportunities to work on various aspects of Machine Learning and Data Engineering. The job offers the opportunity to make a major impact on the product while working with an awesome and talented team, using the latest technologies.

You will:

  • Analyze large data sets to get insights using statistical analysis tools and techniques.
  • Build, evaluate, and deploy machine learning models.
  • Maintain ongoing reliability, performance, and support of the data infrastructure, providing solutions based on application needs and anticipated growth.
  • Work with tools to configure, monitor, and orchestrate data infrastructure and pipelines.
  • Run and monitor A/B tests.
  • Build and manage APIs.
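Evaluating an A/B test usually comes down to a significance test on two conversion rates. A minimal sketch using a pooled two-proportion z-test with the standard library; the user counts and conversions are illustrative:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 10,000 users per variant.
z = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 5% level
```

Real experimentation platforms add sequential-testing corrections and guardrail metrics on top of this basic statistic.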

Qualifications and requirements

  • Bachelor's degree in Computer Science, Statistics, Math, or a related field.
  • 3+ years of experience writing software in a professional setting.
  • Knowledge of AWS and Python.
  • Strong math/stats background with statistical analysis experience on big data sets.
  • Experience with SQL and NoSQL databases (e.g., MongoDB or DynamoDB).
  • Big plus: experience with data warehouses (e.g., Snowflake, BigQuery, Redshift).
  • Exposure to ML/AI libraries such as scikit-learn, LightGBM, and XGBoost.
  • Travel visa to the US (desired).

Conditions

Uber rides to come to the office!
Travel to team's offsite events
Learn about multiple technologies

Pet-friendly Pets are welcome at the premises.
Flexible hours Flexible schedule and freedom for attending family needs or personal errands.
Meals provided Haystack News provides free lunch and/or other kinds of meals.
Paid sick days Sick leave is compensated (limits might apply).
Partially remote You can work from your home some days a week.
Bicycle parking You can park your bicycle for free inside the premises.
Company retreats Team-building activities outside the premises.
Computer repairs Haystack News covers some computer repair expenses.
Commuting stipend Haystack News offers a stipend to cover some commuting costs.
Computer provided Haystack News provides a computer for your work.
Performance bonus Extra compensation is offered upon meeting performance goals.
Conference stipend Haystack News covers tickets and/or some expenses for conferences related to the position.
Informal dress code No dress code is enforced.
Vacation over legal Haystack News gives you paid vacations over the legal minimum.
Beverages and snacks Haystack News offers beverages and snacks for free consumption.
Gross salary $4000 - 5500 Full time
Sr. Full-Stack Data Scientist
  • TECLA
PostgreSQL Django Machine Learning Data Science
TECLA is a fully remote software development company with a diverse and highly skilled team of 100+ experts across Latin America. We specialize in AI, cloud, automation, and DevOps, building innovative solutions for high-growth startups and established mid-market technology companies in the U.S.
Our work culture is agile, collaborative, and fast-paced—we take our work seriously while keeping the environment fun and engaging. As a remote-first company, we emphasize strong teamwork and close collaboration, working seamlessly with our U.S. partners as one integrated unit. Whether optimizing cloud environments, developing AI-driven applications, or enhancing DevOps automation, TECLA is at the forefront of technical innovation, solving complex challenges with cutting-edge technology.


Job Details:

We are growing to the next level of maturity, having found early success and market validation. We are seeking a stellar Full-Stack Data Scientist to lead the development of data-driven features and insights that will shape our core product and elevate customer decision-making. The selected candidate will work directly with senior leadership and will own the machine learning and data roadmap, from problem definition to production deployment. If you have world-class data science skills, thrive in ambiguity, move fast, and are excited about building technology that makes a difference, this role is for you.

What You’ll Do:

  • Design and implement the ML, AI and data science solutions that power customer-facing features.
  • Own the full ML lifecycle, from data exploration and modeling to deployment and monitoring.
  • Translate customer problems into scalable, data-driven solutions.
  • Contribute to our overall data and ML infrastructure, including tooling and deployment pipelines.
  • Collaborate cross-functionally to ensure insights are delivered effectively and reliably.
  • Help define our data strategy.

What You Bring:

  • BS or MS in Data Science, Machine Learning, Computer Science, or a related field.
  • 5+ years of proven track record delivering ML/AI-driven products in commercial environments.
  • Expertise in both supervised and unsupervised learning techniques, including model selection, tuning, and evaluation.
  • Strong skills across machine learning, statistical modeling, and data engineering.
  • Experience working with LLMs for summarization, generation, and agentic workflows.
  • Experience shipping models to production, including deployment, monitoring, and versioning.
  • Proven ability to work independently in a fast-paced, ambiguous, early-stage environment.
  • Strong communication and interpersonal skills.

Bonus Points For:
  • Familiarity with MLOps.
  • Comfort working across backend infrastructure (e.g., Django, APIs, PostgreSQL).
  • Prior experience in Fintech.

Benefits:

  • A fully remote position with a structured schedule that supports work-life balance.
  • The opportunity to make a huge impact by contributing to cutting-edge technology and product innovation.
  • Two weeks of paid vacation per year.
  • 10 paid days for local holidays.
*Please note we are only looking for full-time dedicated team members who are eager to fully integrate within our team.

Fully remote You can work from anywhere in the world.
$$$ Full time
Data Engineer – Artificial Intelligence
  • Factor IT
Java Python SQL Web server
FactorIT is a technology leader with a presence in 8 countries, dedicated to delivering innovative solutions in Data & Analytics, Digital Transformation, and Artificial Intelligence. We are looking for a talented professional to join our team building data infrastructure for AI projects, a great opportunity to take part in challenging, meaningful initiatives with impact across multiple industries.


Responsibilities:

In this role, the candidate will be responsible for:
  • Developing and optimizing data pipelines for AI models.
  • Collaborating with data architects and data scientists to build scalable, efficient solutions.
  • Managing the integration and storage of large data volumes using a variety of AI tools and platforms.
  • Improving data quality, ensuring data is properly prepared for training Machine Learning models.
  • Implementing automated processes for data collection and processing.
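The automated collection and processing mentioned above is often structured as a staged extract/clean/load pipeline. A minimal sketch using Python generators; the record schema and cleaning rules are illustrative:

```python
def extract(rows):
    """Yield raw records (stand-in for reading from a queue or file)."""
    yield from rows

def clean(records):
    """Drop records with a missing label or empty text; normalize text."""
    for r in records:
        if r.get("label") is not None and r.get("text"):
            yield {"text": r["text"].strip().lower(), "label": r["label"]}

def load(records):
    """Materialize cleaned records (stand-in for writing to a store)."""
    return list(records)

# Hypothetical raw records awaiting preparation for model training.
raw = [
    {"text": "  Machine FAILURE detected ", "label": 1},
    {"text": "", "label": 0},                    # dropped: empty text
    {"text": "Nominal reading", "label": None},  # dropped: missing label
    {"text": "Nominal reading", "label": 0},
]
dataset = load(clean(extract(raw)))
print(len(dataset))  # 2
```

Because each stage is a generator, records stream through one at a time, which is the same shape orchestrators like Spark or Airflow tasks scale up.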

Requirements:

We are looking for someone who meets the following qualifications:
  • At least 4 years of experience in data engineering with an emphasis on Artificial Intelligence.
  • Technical knowledge of data processing tools such as Apache Spark, Kafka, and Hadoop.
  • Experience working with large data volumes and cloud storage technologies.
  • Proficiency in programming languages such as Python, SQL, and others oriented toward data manipulation.
  • Hands-on knowledge of implementing Machine Learning and Deep Learning.

Desirable:

Experience in agile work environments and the ability to manage multiple projects simultaneously. A passion for innovation and a desire for continuous learning are highly valued.

What we offer:

At FactorIT, we promote a dynamic environment of technological innovation. We offer:
  • Continuous training in emerging technologies, especially AI.
  • Flexibility and the option of remote work, allowing a better work-life balance.
  • Opportunities for professional growth in an inclusive environment that values creativity and teamwork.
If you are a data engineering expert with a passion for Artificial Intelligence, we are waiting for you! 🚀

Fully remote You can work from anywhere in the world.
Computer provided Factor IT provides a computer for your work.