Calculating Potential HNT Earnings with Wikipedia, Python, and the Helium Coverage Map


By Passivebot | Passivebot | 22 Apr 2021


South-Eastern Washington’s Near Non-Existent Helium Presence

South-Eastern Washington has yet to start operating a proper Helium network.

In my last article, we covered the Internet of Things and the technologies present in the IoT sector. The Helium network is a network of Hotspots that transmit and receive radio signals for IoT-connected devices. Hotspots record all transactions on a blockchain and reward owners for providing coverage with HNT, a cryptocurrency token.

In general, the more signals a given Hotspot receives, the more HNT it earns. A single, isolated Hotspot earns the least, as it can only issue Challenges over the internet and can't participate in Proof-of-Coverage.

Helium Hotspots serve as both the backbone of the network by providing many square miles of connectivity (depending on environment), and as nodes for the Helium blockchain which powers the reward system.

Hotspot Distance Management

The reward model is designed so that token earnings diminish if Hotspots are placed too close to one another. Based on initial testing, only about 50 to 150 Hotspots are needed to provide complete coverage for an entire city.

Representation of Helium nodes in an urban environment.

A good rule of thumb is to keep a minimum distance of 300 to 500 meters between Hotspots, though this varies with the environment (closer in a dense city, farther in a rural area).
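If you want to sanity-check spacing before adding a second Hotspot, you can compute the distance between two sets of GPS coordinates. Below is a minimal Python sketch using the haversine formula; the coordinates are hypothetical placeholders, not real Hotspot locations.

# Rough spacing check between two hotspot locations (haversine formula)
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in meters
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Hypothetical coordinates for two planned hotspot sites
d = distance_m(46.2087, -119.1199, 46.2112, -119.1150)
print(f"{d:.0f} m apart")  # aim for at least 300-500 m of separation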

Hotspots are manufactured using commodity open-standards components with no proprietary hardware. Each Hotspot can support thousands of connected devices and provide coverage over many square miles. If you're interested in getting into mining Helium (HNT), please consider using my Nebra affiliate link.

Calculating Potential HNT Earnings

  1. Find the total population and total area of your city.
  2. Match your city's population with the y-axis of the graph shown below to find out how many cities fall in your city's population size bracket.
  3. Narrow down your sample size by finding all the cities in your population size bracket.
  4. Then narrow down the results further by total area. (n = 13)
  5. Gather and extract data from the Helium coverage map of cities in your sample size.
  6. Use extracted data to make inferences on how much HNT a Helium network would mine in your city.

Population Statistics

How many incorporated places are registered in the U.S.?

There were 19,502 incorporated places registered in the United States as of July 31, 2019. Of these, 16,410 had a population under 10,000, while only 10 cities had a population of one million or more.

Graph A: Number of U.S. cities, towns, villages by population size 2019

Total Population and Area of Tri-Cities, Washington

Tri-Cities, Washington has a total population of 283,830. This puts us toward the top of the population size axis (y-axis) of the graph shown above. There are 52 other cities in the United States in the 250,000–499,999 population size bracket.

  Total population of Tri-Cities, WA

Number of U.S. cities, towns, villages by population size 2019

Scraped from: https://en.wikipedia.org/wiki/List_of_United_States_cities_by_population

Once we have U.S. cities' population data, we can further narrow down our sample size by finding all 52 cities in our population size bracket. Then we'll narrow down the results further by total area. For the sake of simplicity, our end sample size is going to be 13 cities.

  Total Area Tri-Cities, WA

End sample size

Here is our end sample: 13 U.S. cities that fall within the 250,000–499,999 population size bracket and are as close as possible to the 109 mi² total area of Tri-Cities, WA.
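If you'd rather do this narrowing in code instead of by hand, here is a minimal pandas sketch. It assumes the cities.csv file produced later in this article, with its 'City', '2019estimate', and '2016 land area' columns; the bracket bounds and the 109 mi² target come from the numbers above.

import pandas as pd

# Load the scraped city data (produced later in this article)
df = pd.read_csv('cities.csv')

# Keep cities in the 250,000-499,999 population size bracket
bracket = df[(df['2019estimate'] >= 250_000) & (df['2019estimate'] <= 499_999)].copy()

# Strip the " sq mi" suffix so land area can be compared numerically
bracket['area_sq_mi'] = bracket['2016 land area'].str.replace(' sq mi', '', regex=False).astype(float)

# Keep the 13 cities whose total area is closest to Tri-Cities' 109 sq mi
bracket['area_diff'] = (bracket['area_sq_mi'] - 109).abs()
sample = bracket.nsmallest(13, 'area_diff')
print(sample[['City', '2019estimate', 'area_sq_mi']])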

 

If you’re more interested in the blockchain aspects of Helium and not Python programming, you can skip ahead to the next article. I’ve already included all the relevant end-data in the two Airtables up above.

The rest of the article is a technical explanation of how I used Python to extract the above population data from Wikipedia using Pandas.

Scraping population estimates of US Cities from Wikipedia using Pandas

Prerequisites for running Pandas with Python

The easiest way to install Pandas in a Python environment is with the pip package installer.

pip install pandas
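Note that read_html() relies on a separate HTML parser such as lxml or html5lib being installed. If the scraping call later in this article complains about a missing parser, installing lxml alongside pandas usually resolves it:

pip install pandas lxml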

How to scrape data from Wikipedia using Pandas with Python

Once you have completed the prerequisites section, you're ready to start scraping HTML data tables with Python.

Step 1: First, import the pandas module as pd. Pandas is one of the most powerful and flexible open-source data analysis/manipulation tools available in any language.

# Required libraries
import pandas as pd

Pandas is well suited for munging and cleaning data, analyzing/modeling it, and then organizing the results of the analysis into a form suitable for plotting or tabular display.

Step 2: When we call pandas' read_html() method, we get back a list of all the HTML tables present on the web page at the URL provided. We can view these tables using the print() method.

# Read html tables from Wikipedia
df = pd.read_html('https://en.wikipedia.org/wiki/List_of_United_States_cities_by_population')
# Display all HTML tables found on the Wikipedia page
print(df)

Data wrangling is a multi-step process.

Among all these tables, we're only looking for one specific table containing 314 rows × 11 columns.

df[4]
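If you're not sure which index the city table ends up at (Wikipedia page layouts change over time), a quick sketch like the one below prints the index and shape of every table read_html() returned, so you can spot the one with roughly 314 rows and 11 columns:

# Print the index and shape of every table found on the page
for i, table in enumerate(df):
    print(i, table.shape)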

Step 3: Now that we know which table we're looking for, we filter df[4] down to the columns we need: the city name, the 2019 population estimate, and the 2016 land area. Lastly, we save the filtered data to a CSV file.

 

# Required libraries
import pandas as pd

# Read html tables from Wikipedia
df = pd.read_html('https://en.wikipedia.org/wiki/List_of_United_States_cities_by_population')

# Select the table at index 4 and keep only the columns we need
df = df[4].filter(items=['City', '2019estimate', '2016 land area'])

# Sort values of data frame by order of lowest to highest 2019estimate
df = df.sort_values('2019estimate')

# print(df)

# Save data to csv
df.to_csv('cities.csv')

 

We’re almost done but there’s a small issue.

Terminal output

The data we saved to the CSV is actually split into four columns instead of three, because to_csv() writes the DataFrame index as an extra first column:

Index, City, 2019estimate, 2016 land area.
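If you want to avoid the extra column altogether, to_csv() accepts an index argument; the one-line variation below drops the index from the output. The rest of this article keeps the default behavior, so the column positions used later still include the index.

# Write the CSV without the DataFrame index column
df.to_csv('cities.csv', index=False)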

 

Step 4: To pull out only the data we need for the Airtable, we open the CSV file containing the population data and create a reader object so we can iterate over its rows.


# Required library for reading the csv file back in
import csv

# Save data to csv
df.to_csv('cities.csv')

# Open csv containing population data
f = open('cities.csv')
# Return a reader object which iterates over lines of the csv file
csv_f = csv.reader(f)
# Skip the header row
next(csv_f)
# row[1] holds the city name, row[2] the 2019 population estimate,
# and row[3] the 2016 land area
for row in csv_f:
    area = row[3]
    # Use the replace method to remove the " sq mi" suffix
    area = area.replace(' sq mi', '')
    # Display the cleaned value
    print(area)

Using the csv.reader() object, we can iterate over the rows of the CSV file and pick out specific columns by their index.

Step 5: Finally, we display the cleaned values with the print() method. This way we copy only the required values from each column into the Airtable.
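As a side note, the same cleanup can be done without the csv module by staying in pandas; this is an alternative sketch rather than the approach used above:

# Strip the " sq mi" suffix directly in the data frame
df['2016 land area'] = df['2016 land area'].str.replace(' sq mi', '', regex=False)
# Print one column at a time to copy into the Airtable
print(df['City'].to_string(index=False))
print(df['2019estimate'].to_string(index=False))
print(df['2016 land area'].to_string(index=False))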

2019 Population estimates

Final Thoughts

In this article, we briefly covered the distance management aspect of Helium hotspots. Then we discussed calculating potential HNT earnings by comparing HNT earnings in similar-sized cities across the United States. Finally, we covered population statistics and scraped population estimates of US Cities from Wikipedia using Pandas.

Up ahead, we’re going to be using our scraped data from population estimates of US cities to make an educated conclusion on how much HNT a Helium Hotspot in Tri-Cities, WA would earn per month.

Helium coverage map over Tampa, Florida.  

Don't have a Bitcoin wallet yet? Start storing, trading, and earning cryptocurrencies with Coinbase.

For more tips and updates, follow me on Passivebot, Twitter, Facebook, and GitHub.

Bitcoin Cash donation: qqdxn2rk4ze9q2zp7vmcrlaw9tdqnlf0eu8mumkz3l

Bitcoin donation: 39n7ovER7BUb7qP7rwH8EsZ2gx99JN7mYV

This article is for informational purposes only. It should not be considered financial or legal advice. The information contained here may not be accurate at all times. Consult a financial professional before making any major financial decisions.

 
