
Time Series Classification for Fatigue Detection in Runners — A Tutorial | by K Bahavathy | Dec, 2023

A step-by-step walkthrough of inter-participant and intra-participant classification performed on wearable sensor data from runners. Running data collected using wearable sensors can provide insights into a runner’s performance and overall technique. The data coming from these sensors is usually time series in nature. This tutorial runs through a fatigue detection task where… (a minimal split-strategy sketch follows this entry).

Read More
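The sketch below is not the tutorial’s code; the data, features, and labels are synthetic stand-ins. It only illustrates the inter-participant versus intra-participant evaluation the teaser refers to, using scikit-learn: holding out entire runners tends to give a more honest estimate of how a model generalizes to people it has never seen.

```python
# Hypothetical sketch (not the tutorial's code): inter- vs intra-participant splits
# on synthetic "sensor" windows. All data below is random and for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GroupShuffleSplit, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))            # 600 windows of summarized sensor features
y = rng.integers(0, 2, size=600)          # 0 = fresh, 1 = fatigued (synthetic labels)
runners = rng.integers(0, 10, size=600)   # which runner each window came from

# Intra-participant: windows from the same runner may land in both train and test.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("intra-participant accuracy:", accuracy_score(y_te, clf.predict(X_te)))

# Inter-participant: every window from a held-out runner goes to the test set.
tr_idx, te_idx = next(
    GroupShuffleSplit(test_size=0.3, random_state=0).split(X, y, groups=runners)
)
clf = RandomForestClassifier(random_state=0).fit(X[tr_idx], y[tr_idx])
print("inter-participant accuracy:", accuracy_score(y[te_idx], clf.predict(X[te_idx])))
```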

On Why Machines Can Think. How can we think about thinking in the… | by Niya Stoimenova | Dec, 2023

How can we think about thinking in the simplest way possible? In the 17th century, René Descartes introduced a relatively new idea: the dictum “cogito ergo sum” (“I think, therefore I am”). This simple formulation served as the basis of Western philosophy and defined for centuries our ideas on…

Read More

Transitioning from ETL to ELT with AnalyticsEng

How cloud computing and analytics engineering forced the transition from ETL to ELT. ETL (Extract-Transform-Load) and ELT (Extract-Load-Transform) are two terms commonly used in the realm of Data Engineering, and more specifically in the context of data ingestion and transformation. While these terms are often used interchangeably, they refer to slightly different… (a minimal sketch contrasting the two patterns follows this entry).

Read More
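As a rough illustration of the distinction, and not the article’s code, the sketch below uses an in-memory SQLite database as a stand-in for a cloud warehouse: ETL transforms the data in the pipeline before loading it, while ELT loads the raw data first and transforms it inside the warehouse with SQL.

```python
# Hypothetical illustration (not from the article): ETL vs ELT,
# with in-memory SQLite standing in for a cloud data warehouse.
import sqlite3
import pandas as pd

raw = pd.DataFrame({"user_id": [1, 2, 3], "amount_cents": [1250, 3000, 499]})

# --- ETL: transform in the pipeline, then load the finished table ---
transformed = raw.assign(amount_usd=raw["amount_cents"] / 100).drop(columns="amount_cents")
etl_db = sqlite3.connect(":memory:")
transformed.to_sql("orders", etl_db, index=False)

# --- ELT: load the raw data as-is, then transform inside the warehouse with SQL ---
elt_db = sqlite3.connect(":memory:")
raw.to_sql("orders_raw", elt_db, index=False)
elt_db.execute(
    "CREATE TABLE orders AS "
    "SELECT user_id, amount_cents / 100.0 AS amount_usd FROM orders_raw"
)

print(pd.read_sql("SELECT * FROM orders", elt_db))
```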

A Guide on 12 Tuning Strategies for Production-Ready RAG Applications | by Leonie Monigatti | Dec, 2023

How to improve the performance of your Retrieval-Augmented Generation (RAG) pipeline with these “hyperparameters” and tuning strategies. Data Science is an experimental science. It starts with the “No Free Lunch Theorem,” which states that there is no one-size-fits-all algorithm that works best for every problem. And it results in… (a minimal retrieval sketch with two such knobs follows this entry).

Read More
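A minimal sketch of two such knobs, not taken from the article: chunk size and the number of retrieved chunks both change what context reaches the generator. A plain TF-IDF retriever stands in here for a vector database.

```python
# Hypothetical sketch (not the article's code): two RAG "hyperparameters" —
# chunk size and top_k — demonstrated with a TF-IDF retriever instead of a vector DB.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

CHUNK_SIZE = 40   # characters per chunk; one tuning knob
TOP_K = 2         # number of chunks handed to the generator; another knob

document = (
    "Retrieval-Augmented Generation pairs a retriever with a language model. "
    "The retriever finds relevant chunks, and the model grounds its answer in them."
)

# Chunk the document, vectorize the chunks, and retrieve the TOP_K most similar ones.
chunks = [document[i:i + CHUNK_SIZE] for i in range(0, len(document), CHUNK_SIZE)]
vectorizer = TfidfVectorizer().fit(chunks)
chunk_vectors = vectorizer.transform(chunks)

query = "What does the retriever do?"
scores = cosine_similarity(vectorizer.transform([query]), chunk_vectors)[0]
top_chunks = [chunks[i] for i in scores.argsort()[::-1][:TOP_K]]

# These chunks would be placed into the prompt of the generator LLM.
print(top_chunks)
```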

LLMs for Everyone: Running LangChain and a MistralAI 7B Model in Google Colab | by Dmitrii Eliuseev | Dec, 2023

Experimenting with Large Language Models for free. Everybody knows that large language models are, by definition, large. And even not so long ago, they were available only for high-end hardware owners, or at least for people who paid for cloud access or even every API… (a minimal loading sketch follows this entry).

Read More
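A minimal sketch of the kind of setup the article describes, assuming a free Colab GPU runtime: the model ID, the 4-bit bitsandbytes quantization, and the LangChain import path are assumptions here and may differ from the article’s exact steps.

```python
# Hypothetical sketch (the article's exact setup may differ): loading a 4-bit
# quantized Mistral 7B with transformers and exposing it to LangChain.
# Assumes a GPU runtime (e.g. a free Colab T4) with transformers, accelerate,
# bitsandbytes, and langchain installed.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, pipeline
from langchain.llms import HuggingFacePipeline  # import path varies by LangChain version

model_id = "mistralai/Mistral-7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # roughly 4-5 GB of VRAM
    device_map="auto",
)

# Wrap the Hugging Face text-generation pipeline so LangChain can call it as an LLM.
generate = pipeline("text-generation", model=model, tokenizer=tokenizer, max_new_tokens=128)
llm = HuggingFacePipeline(pipeline=generate)

# On older LangChain releases, calling llm("...") directly works instead of invoke().
print(llm.invoke("Explain in one sentence why quantization helps run a 7B model in Colab."))
```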

LLM and GNN: How to Improve Reasoning of Both AI Systems on Graph Data | by Anthony Alcaraz | Dec, 2023

Graph neural networks (GNNs) and large language models (LLMs) have emerged as two major branches of artificial intelligence, achieving immense success in learning from graph-structured and natural language data, respectively. As graph-structured and natural language data become increasingly interconnected in real-world applications, there is a growing need for artificial intelligence systems that can perform multi-modal…

Read More