by Thom Ives, Ph.D.
This class is meant to help us practice ABV in our DS, ML, and AI work: Always Be Visualizing.
Why not Manim? I had a suspicion that it would be easier, at least for me, to do this in Visual Python, which I have been using for over 20 years now. That said, which is best? I don't know yet. I feel they are both good and powerful, and I do want to do the same in Manim. I suspect that it might work out quite easily in Manim too, but I am not sure yet whether it would be as flexible as this tool turned out to be. And this is only the beginning for this class. NOTE that there is only an __init__ function at this point. We will be able to add dynamic coloring, images representing neuron activations, and more in the future. However, this is a great start.
```python
import vpython as vp
import numpy as np
```
We want to establish colors in one place so that we do not need to update multiple places when we want to change them.
```python
neuron_color = vp.vector(0.00, 0.49, 0.73)
weight_color = vp.vector(0.80, 0.80, 0.80)
text_color = vp.vector(1.0, 1.0, 1.0)
```
We pass in an architecture, which is simply an array whose elements are the number of neurons in each layer. Thus, the length of the architecture array is the number of layers, including the input and output layers. We also pass in a title. We then initialize storage arrays for the Visual Python objects that form the network. Finally, we establish some helpful values for spacing out the network objects.
```python
class Network:
    def __init__(self, architecture, title):
        neuron_storage = []
        weight_storage = []
        net_amp = (len(architecture) - 1) / 2
        max_amp = (max(architecture) - 1) / 2
```
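As a quick sanity check, here is the same amplitude math run standalone (outside the class) for the sample architecture instantiated later in the post:

```python
# Spacing math from Network.__init__, shown standalone.
architecture = [2, 3, 4, 3, 2]  # sample architecture used later in the post

# Half-width of the network along x (layers sit 1 unit apart):
net_amp = (len(architecture) - 1) / 2

# Half-height of the tallest layer along y:
max_amp = (max(architecture) - 1) / 2

print(net_amp, max_amp)  # 2.0 1.5
```

These two values let us center the whole drawing on the origin: x positions run from -net_amp to +net_amp, and the title and footer text sit just above and below ±max_amp.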
We also establish some helpful values for spacing out the text objects.
```python
        x_text = net_amp
        y_text = max_amp + 0.5
```
Create the title text.
```python
        vp.text(text=title, pos=vp.vector(0, y_text, 0),
                align="center", color=text_color, height=0.2)
```
Create helpful text below the network.
```python
        # Descriptive Text Below the Network
        vp.text(text="inputs", pos=vp.vector(-x_text, -y_text, 0),
                align="right", color=text_color, height=0.2)
        vp.text(text="|", pos=vp.vector(-x_text + 0.5, -y_text, 0),
                align="center", color=text_color, height=0.2)
        vp.text(text="hidden layers", pos=vp.vector(0, -y_text, 0),
                align="center", color=text_color, height=0.2)
        vp.text(text="|", pos=vp.vector(x_text - 0.5, -y_text, 0),
                align="center", color=text_color, height=0.2)
        vp.text(text="outputs", pos=vp.vector(x_text, -y_text, 0),
                align="left", color=text_color, height=0.2)
```
Fill out the space needed for neuron storage.
```python
        # Initialize the Neuron Storage
        for n in architecture:
            layer = [0 for i in range(n)]
            neuron_storage.append(layer)
```
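Running that same fill loop outside the class shows the placeholder shape that the spheres will overwrite in the next step:

```python
architecture = [2, 3, 4, 3, 2]  # sample architecture used later in the post

neuron_storage = []
for n in architecture:
    # Zeros are placeholders; vp.sphere objects replace them in the next step.
    neuron_storage.append([0 for _ in range(n)])

print([len(layer) for layer in neuron_storage])  # [2, 3, 4, 3, 2]
```

The nested-list shape mirrors the architecture exactly, which is what lets us index any neuron later as neuron_storage[layer_num][neuron_num].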
Now, create neurons in the correct locations for each layer.
```python
        # Create and Store the Neuron Spheres
        for layer_num, num_neurons in enumerate(architecture):
            layer_amp = (num_neurons - 1) / 2
            for neuron_num in range(num_neurons):
                neuron_pos = vp.vector(
                    layer_num - net_amp, neuron_num - layer_amp, 0)
                neuron_storage[layer_num][neuron_num] = vp.sphere(
                    pos=neuron_pos, radius=0.1, color=neuron_color)
```
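The position math can be checked without VPython at all. Here is a minimal standalone sketch, with plain (x, y) tuples standing in for vp.vector, applied to a tiny two-layer architecture:

```python
def neuron_positions(architecture):
    """Return (x, y) coordinates for every neuron, centered on the origin."""
    net_amp = (len(architecture) - 1) / 2
    positions = []
    for layer_num, num_neurons in enumerate(architecture):
        layer_amp = (num_neurons - 1) / 2  # half-height of this layer
        for neuron_num in range(num_neurons):
            positions.append((layer_num - net_amp, neuron_num - layer_amp))
    return positions

print(neuron_positions([2, 3]))
# [(-0.5, -0.5), (-0.5, 0.5), (0.5, -1.0), (0.5, 0.0), (0.5, 1.0)]
```

Notice that each layer is individually centered on y = 0 via its own layer_amp, which is why layers of different sizes still line up symmetrically.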
Now create the weight bond objects. These are actually just long, thin cylinders.
```python
        # Create and Store the Weight Bonds
        for layer_num, current_n in enumerate(architecture[:-1]):
            next_n = architecture[layer_num + 1]
            weight_storage.append([])
            for i in range(current_n):
                weight_storage[-1].append([])
                x_start = layer_num - net_amp
                x_end = x_start + 1
                for j in range(next_n):
                    y_start = i - (current_n - 1) / 2
                    y_end = j - (next_n - 1) / 2
                    start = vp.vector(x_start, y_start, 0)
                    end = vp.vector(x_end, y_end, 0)
                    weight_storage[-1][-1].append(vp.cylinder(
                        pos=start, axis=end - start, radius=0.01,
                        color=weight_color))
```
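One consequence of the triple loop above: since each pair of adjacent layers is fully connected, the number of cylinders is the sum of products of adjacent layer sizes, and it grows quickly with layer width. A quick standalone check for the architecture used below:

```python
architecture = [2, 3, 4, 3, 2]

# Each pair of adjacent layers is fully connected, so the bond count is
# the sum of products of adjacent layer sizes.
num_bonds = sum(a * b for a, b in zip(architecture, architecture[1:]))
print(num_bonds)  # 2*3 + 3*4 + 4*3 + 3*2 = 36
```

This is worth keeping in mind before visualizing very wide networks, since every bond is its own VPython cylinder object.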
Finally, we make our storage objects of visual neuron and weight objects into class attributes for later usage.
```python
        # Make Neuron and Weight Bond Storage Class Attributes
        self.neuron_storage = neuron_storage
        self.weight_storage = weight_storage
```
The instantiation of this object builds a static 3D neural network object. Why does 3D matter? It doesn't ... yet. I intend to extend this work to transformers and beyond. I believe 3D will end up being very helpful for that work.
```python
arch = [2, 3, 4, 3, 2]
network = Network(arch, "Neural Network")
```
NOTE that we could import this Network class from this module into a separate file and then instantiate it there. I just did it here for convenience.
This work will be ongoing. It started with my PyTorch Journey, but I am sure it will be reused on many other projects.