Let’s consider a business deal where multiple parties are negotiating the terms. In such a situation, it’s usually not possible for every single party to get everything it wants. Each party needs to prioritize its demands so that everyone comes out with something positive. Similar situations arise across many areas of engineering, where we have to manage many resources and make trade-offs between cost, quality, speed, and so on. How do we model this problem and decide on the optimal state of affairs? This is where the concept of Pareto Optimality comes into the picture. Continue reading
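To make the idea concrete, here is a minimal sketch (not from the post itself) of how Pareto optimality can be checked numerically. The (cost, defect rate) options are made up for illustration; lower is better on both axes, and an option is Pareto-optimal when no other option beats it on every axis.

```python
# Minimal sketch: find the Pareto-optimal options among (cost, defect_rate)
# pairs, where lower is better on both axes. The data points are invented
# for illustration.

def dominates(a, b):
    """True if option a is at least as good as b on every axis
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(options):
    """Keep only the options that no other option dominates."""
    return [o for o in options if not any(dominates(p, o) for p in options if p != o)]

options = [(10, 0.30), (20, 0.10), (30, 0.08), (25, 0.10), (15, 0.25)]
print(pareto_front(options))  # (25, 0.10) drops out: (20, 0.10) is cheaper at the same quality
```

Every option left on the front represents a legitimate trade-off; choosing among them is where the negotiation (or a weighting of cost versus quality) comes in.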

# What Is Long Memory In Time Series Analysis

We encounter time series data very frequently in the real world. Some common examples include real-time sensors, surveillance video, stock market prices, astrophysics, speech recognition, and so on. In order to study time series data, we try to extract the various characteristics that tend to define it. One of the most important things to consider is the dependence between points in the series. Is there any dependence between the values in the time series data? If so, how far apart in time can they be and still affect each other? Understanding these aspects opens up new doors in terms of how we analyze the data. This is where the concept of long memory comes into the picture. Let’s dig a little deeper and understand it, shall we? Continue reading
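One standard way to probe this dependence, sketched below with a synthetic series, is the sample autocorrelation at increasing lags: a long-memory series keeps meaningful correlation even at large lags, while white noise drops to near zero immediately. Note the hedge: the persistent AR(1) series here only mimics slow decay (its correlations still decay exponentially); true long memory is defined by hyperbolic decay, but the diagnostic is the same.

```python
# Sketch: sample autocorrelation at increasing lags, on a synthetic
# persistent AR(1) series. Watching how slowly these values fall off
# is the basic diagnostic behind long memory.
import random

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

random.seed(0)
# Strongly persistent series: each value remembers most of the previous one.
x = [0.0]
for _ in range(2000):
    x.append(0.95 * x[-1] + random.gauss(0, 1))

for lag in (1, 5, 20, 50):
    print(lag, round(autocorr(x, lag), 3))  # correlation decays as lag grows
```

For genuinely long-memory data (river flows, network traffic), the analogous plot would flatten out instead of collapsing, which is exactly what estimators like the Hurst exponent try to quantify.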

# What Is Monte Carlo Simulation

There are many phenomena in everyday life that are very difficult to model. There are so many variables and so many dependencies that any approximation or assumption would lead to huge errors in the outputs. This is usually due to a combination of uncertainty and variability. Even when we have access to all the historical information, we can’t accurately predict a future outcome because our models are inaccurate. This becomes especially relevant when we are dealing with systems where the degrees of freedom depend on each other. Examples include the movement of fluids or the kinetic modeling of gases. How do we compute the possible outcomes? How can we assess the impact of all the free variables to make sure we predict the outcome under uncertainty? Continue reading
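The core Monte Carlo idea can be shown in a few lines: when a quantity is hard to derive analytically, sample random inputs and average the outcomes. The classic toy example below (a sketch, not from the post) estimates pi by throwing random points at the unit square and counting how many land inside the quarter circle.

```python
# Monte Carlo sketch: estimate pi by random sampling. The fraction of
# uniform points in the unit square that fall inside the quarter circle
# approaches pi/4 as the sample count grows.
import random

def estimate_pi(n_samples, seed=42):
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as n_samples grows
```

The same recipe scales to the hard cases mentioned above: replace the quarter-circle test with a simulation of the system, draw the uncertain inputs from their distributions, and read the outcome distribution off the samples.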

# Understanding IoT Gateways

The Internet of Things (IoT) ecosystem is rapidly expanding. Some analysts predict that there will be around 50 billion connected devices by 2020. If you are new to IoT, it refers to the collective ecosystem of devices that are connected to the internet. These devices can be sensors, actuators, health monitors, meters, and so on. What did people do before IoT? Well, they had devices that weren’t connected to the internet, which made it difficult to monitor and analyze data in real time. This meant that people were leaving a lot of interesting data unused, which directly translates to billions of dollars in lost revenue. By connecting all the devices to the internet, we enable ourselves to take action in real time. It’s obvious that device connectivity is a really important aspect of IoT. How do we ensure connectivity? How can we enable low-cost hardware devices to communicate with the cloud without expensive processors? Continue reading
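One answer a gateway gives to that last question is batching: cheap devices talk to a nearby gateway over an inexpensive local link, and the gateway handles the internet connection on their behalf. The sketch below illustrates just that aggregation logic; the `uplink` callable is a hypothetical stand-in for a real cloud connection (HTTP, MQTT, etc.), not an actual API.

```python
# Illustrative sketch of one job an IoT gateway does: collect readings
# from cheap local devices and forward them to the cloud in batches, so
# each device doesn't need its own internet stack. `uplink` is a
# hypothetical stand-in for the real cloud transport.
import json
import time

class Gateway:
    def __init__(self, batch_size=3, uplink=print):
        self.batch_size = batch_size
        self.uplink = uplink   # stand-in for the real cloud connection
        self.buffer = []

    def on_reading(self, device_id, value):
        """Buffer a local device reading; flush once the batch is full."""
        self.buffer.append({"device": device_id, "value": value, "ts": time.time()})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Send all buffered readings as one payload, then clear the buffer."""
        if self.buffer:
            self.uplink(json.dumps(self.buffer))  # one uplink call per batch
            self.buffer = []

sent = []
gw = Gateway(batch_size=3, uplink=sent.append)
for i, temp in enumerate([21.5, 21.7, 22.0]):
    gw.on_reading(f"sensor-{i}", temp)
print(len(sent))  # one batched payload instead of three separate device uplinks
```

Real gateways add protocol translation, buffering during outages, and local preprocessing on top of this, but the cost argument is already visible: the expensive networking lives in one box instead of in every device.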

# Deep Learning For Smart Cities

In recent years, technological advancements in hardware, software, and embedded systems have enabled billions of smart devices to be connected to the internet. This ecosystem is collectively referred to as the Internet of Things. A lot of people are actively migrating to cities, which means essential resources are going to get scarcer. Cities will have to manage infrastructure like water, power, and transport very effectively if they want to support everybody. But how do we do that? The data being collected varies so much in quality and format that it becomes very difficult to use effectively. How can we make good use of the data collected by connected sensors? Continue reading

# Estimating The Predictability Of Time Series Data – Part II

In the previous blog post, we discussed various types of time series data. We understood the concepts of stationarity and shocks. In this blog post, we will continue to discuss how we can estimate the predictability of time series data. People say that the future is unpredictable. But that’s grossly reductive! What they actually mean is: I’m blindly assuming that my time series data is non-stationary, so I cannot accurately predict what’s going to happen in the future. Predicting future values can open a lot of doors in the Internet of Things (IoT) ecosystem. Before we forecast future values, it’s important to determine whether the time series data exhibits any properties that can be modeled. If not, we are just dealing with chaos and no model will be good enough. But a lot of data in the real world exhibits patterns, so we just need to look at it the right way. Let’s see how we can check if the given time series data has any underlying trends, shall we? Continue reading
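As a flavor of what such a check looks like, here is a minimal sketch (not the post's own method) of two quick numbers to compute before forecasting: a least-squares slope against time, to look for a deterministic trend, and the lag-1 autocorrelation, to see whether consecutive values carry information about each other. Real analyses would follow up with a proper unit-root test, but these two already separate "patterned" from "chaos". The data is synthetic.

```python
# Two quick pre-forecasting checks on a time series (synthetic data):
# 1) least-squares slope against time  -> deterministic trend?
# 2) lag-1 autocorrelation             -> do neighbors depend on each other?

def slope(x):
    """Least-squares slope of x regressed on time indices 0..n-1."""
    n = len(x)
    t_mean = (n - 1) / 2
    x_mean = sum(x) / n
    num = sum((t - t_mean) * (v - x_mean) for t, v in enumerate(x))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def lag1_autocorr(x):
    """Sample autocorrelation of x at lag 1."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    return sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1)) / var

trending = [0.5 * t for t in range(100)]  # noiseless upward trend
print(round(slope(trending), 2))  # exactly 0.5 for this noiseless line
print(round(lag1_autocorr(trending), 3))  # high: neighbors are strongly related
```

A slope near zero and a lag-1 autocorrelation near zero would instead suggest there is little trend or short-range structure to model.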

# Estimating The Predictability Of Time Series Data – Part I

Time series data refers to a sequence of measurements made over time. The frequency of these measurements is usually fixed, say once every second or once every hour. We encounter time series data in a variety of scenarios in the real world. Some examples include stock market data, sensor data, speech data, and so on. People like to build forecasting models for time series data, which is very relevant when modeling data in the world of the Internet of Things (IoT). Based on past data, they want to predict what’s going to happen in the future. One of the most important questions is whether we can predict anything in the first place. How do we determine that? How do we check if there are underlying patterns in the time series data? Continue reading
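One simple "is there anything to predict?" check, sketched below on synthetic data (this is an illustration, not the post's own method), is to compare the lag-1 autocorrelation of the series against the same series shuffled. If the original shows clear dependence between neighbors and the shuffled copy does not, there is temporal structure a forecasting model could exploit.

```python
# Sketch: detect temporal structure by comparing the lag-1 autocorrelation
# of a series against a shuffled copy of itself. Shuffling destroys time
# ordering, so any dependence that survives shuffling was never temporal.
import random

def lag1_autocorr(x):
    """Sample autocorrelation of x at lag 1."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    return sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1)) / var

random.seed(1)
series = [0.0]
for _ in range(500):
    series.append(0.9 * series[-1] + random.gauss(0, 1))  # persistent signal

shuffled = series[:]
random.shuffle(shuffled)

# Original shows strong dependence; the shuffled copy sits near zero.
print(round(lag1_autocorr(series), 2), round(lag1_autocorr(shuffled), 2))
```

If both numbers came out near zero, the series would behave like pure noise at this lag, and a forecasting model would have little to work with.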