Snake_Byte:[14] Coding In Philosophical Frameworks

DALL-E Generated Philosopher

Your vision will only become clear when you can look into your heart. Who looks outside, dreams; who looks inside, awakes. Knowing your own darkness is the best method for dealing with the darknesses of other people. We cannot change anything until we accept it.

~ C. Jung

(Caveat Emptor: This blog is rather long in the snake’s tooth and is actually more like a CHOMP instead of a BYTE. tl;dr)

First, Oh Dear Reader, i trust everyone is safe. Second, it sure feels like we are living in an age of Deus Ex Machina, doesn’t it? Third, with this in mind i wanted to write a Snake_Byte that i have been “thoughting” about for quite some time but never really knew how to approach, if truth be told. i can’t take full credit for this ideation, nor do i actually want to claim any ideation. Jay Sales and i were talking a long time ago, after i believe i gave a presentation on creating belief systems using BeliefNetworks or some such nonsense.

The net of the discussion was we both believed that in the future we will code in philosophical frameworks.

Maybe we are here?

So how would one go about coding an agent-based distributed system that allowed one to create an agent or a piece of evolutionary code to exhibit said behaviors of a philosophical framework?

Well we must first attempt to define a philosophy and ensconce it into a quantized explanation.

Stoicism seemed to me at least the best first mover here as it appeared to be the tersest by definition.

So first, for those not familiar with said philosophy: Marcus Aurelius was probably the most famous practitioner of Stoicism. i have put some references that i have read at the end of this blog1.

Stoicism is a philosophical school that emphasizes rationality, self-control, and inner peace in the face of adversity. In thinking about this, i figured that to build an agent-based software system that embodies Stoicism, we would need to consider several key aspects of this philosophy.

  • Stoics believe in living in accordance with nature and the natural order of things. This could be represented in an agent-based system through a set of rules or constraints that guide the behavior of the agents, encouraging them to act in a way that is in harmony with their environment and circumstances.
  • Stoics believe in the importance of self-control and emotional regulation. This could be represented in an agent-based system through the use of decision-making algorithms that take into account the agent’s emotional state and prioritize rational, level-headed responses to stimuli.
  • Stoics believe in the concept of the “inner citadel,” or the idea that the mind is the only thing we truly have control over. This could be represented in an agent-based system through a focus on internal states and self-reflection, encouraging agents to take responsibility for their own thoughts and feelings and strive to cultivate a sense of inner calm and balance.
  • Stoics believe in the importance of living a virtuous life and acting with moral purpose. This could be represented in an agent-based system through the use of reward structures and incentives that encourage agents to act in accordance with Stoic values such as courage, wisdom, and justice.
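Before reaching for an evolutionary library, the first two bullets can be sketched directly. The following is a minimal, assumption-level illustration (the class, attribute, and method names are entirely made up for this post, not any established framework): the agent’s internal state is all it controls, and a simple rule damps its response to external stimuli in proportion to its self-regulation.

```python
from dataclasses import dataclass

@dataclass
class StoicAgent:
    """Toy agent: internal state is the only thing it controls (the 'inner citadel')."""
    calm: float = 1.0          # emotional regulation level, 0..1
    rationality: float = 1.0   # weight given to reasoned responses, 0..1

    def respond(self, stimulus_intensity: float) -> float:
        # Rule/constraint: the response is the stimulus damped by
        # self-control, so a calm, rational agent reacts less sharply.
        return stimulus_intensity * (1.0 - self.calm * self.rationality)

agent = StoicAgent(calm=0.8, rationality=0.9)
print(agent.respond(10.0))
```

A rule set like this could serve as the environment-harmony constraint in a larger agent system; the evolutionary approach below then tunes such traits rather than hand-coding them.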

So given a definition of Stoicism, we then need to create a quantized or discrete model of the behaviors that encompass a “Stoic Individual”. i figured we could use the evolutionary library called DEAP (Distributed Evolutionary Algorithms in Python). DEAP contains both genetic algorithm and genetic programming utilities, as well as evolutionary strategy methods, for this type of programming.

Genetic algorithms and genetic programming are both techniques used in artificial intelligence and optimization, but they have some key differences.

This is important as people confuse the two.

Genetic algorithms are a type of optimization algorithm that use principles of natural selection to find the best solution to a problem. In a genetic algorithm, a population of potential solutions is generated and then evaluated based on their fitness. The fittest solutions are then selected for reproduction, and their genetic information is combined to create new offspring solutions. This process of selection and reproduction continues until a satisfactory solution is found.
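The selection-reproduction loop just described can be made concrete with a deliberately tiny genetic algorithm in plain Python (no DEAP; sizes and rates here are arbitrary choices for illustration). The individuals are bit strings and the fitness is simply the number of ones, so evolution should drive the population toward all ones:

```python
import random

random.seed(42)

def fitness(bits):
    return sum(bits)  # the fittest individual is all ones

def evolve(n_bits=8, pop_size=20, generations=50):
    # a population of random candidate solutions
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # selection: the fitter half become parents
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [parents[0][:]]  # elitism: carry the best forward
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:           # bit-flip mutation
                i = random.randrange(n_bits)
                child[i] ^= 1
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

DEAP packages exactly these pieces (selection, crossover, mutation) as reusable operators, which is what the Stoic example below leans on.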

On the other hand, genetic programming is a form of machine learning that uses the same evolutionary machinery to automatically create computer programs. Instead of searching for a single fixed-length solution to a problem, genetic programming evolves a population of computer programs, typically represented as expression trees. The programs are evaluated based on their ability to solve a specific task, and the most successful programs are selected for reproduction, combining their genetic material to create new programs. This process continues until a program is evolved that solves the problem to a satisfactory level.

So the key difference between genetic algorithms and genetic programming is that genetic algorithms search for a solution to a specific problem, while genetic programming searches for a computer program that can solve the problem. Genetic programming is therefore a more general approach, as it can be used to solve a wide range of problems, but it can also be more computationally intensive due to the complexity of evolving computer programs2.
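To make that distinction concrete, here is a toy illustration in plain Python (not DEAP, just a sketch): a GP individual is itself a small program, represented here as a nested tuple tree and scored against a target behavior, rather than being a fixed-length parameter vector as in a GA.

```python
# A candidate 'program' represented as a nested tuple expression tree.
def run(node, x):
    if node == "x":
        return x
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    a, b = run(left, x), run(right, x)
    return a + b if op == "add" else a * b

# This individual encodes the program f(x) = x*x + 1.
program = ("add", ("mul", "x", "x"), 1)

# A GP fitness function scores the program against a target, here x^2 + 1:
error = sum(abs(run(program, x) - (x * x + 1)) for x in range(5))
print(run(program, 3), error)  # -> 10 0
```

In real GP, crossover swaps subtrees between programs and mutation replaces a subtree, which is why evolving programs is more computationally demanding than evolving parameter vectors.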

So, returning back to the main() function as it were, we need to create a genetic program that models Stoic behavior using the DEAP library.

First, we need to define the problem and the relevant fitness function. This is where the quantized part comes into play. Since Stoic behavior involves a combination of rationality, self-control, and moral purpose, we could define a fitness function that measures an individual’s ability to balance these traits and act in accordance with Stoic values.

So let’s get to the code.

To create a genetic program that models Stoic behavior using the DEAP library in a Jupyter Notebook, we first need to install the DEAP library. We can do this by running the following command in a code cell:

!pip install deap

Next, we can import the necessary modules and functions:

import random

from deap import algorithms, base, creator, tools

We can then define the problem and the relevant fitness function, as described above.

Here’s an example of how we might define a “fitness function” for this problem:

# Define the fitness function.  NOTE: i am open to other ways of defining this
# and other models. The definition of what constitutes a behavior needs to be
# quantized or discretized, and trying to do that yields a lossy function most
# times. It is also self-referential.

def fitness_function(individual):
    # Calculate the fitness based on how closely the individual's behavior matches stoic principles
    fitness = 0
    # Add points for self-control, rationality, focus, resilience, and adaptability can haz Stoic?
    fitness += individual[0]  # self-control
    fitness += individual[1]  # rationality
    fitness += individual[2]  # focus
    fitness += individual[3]  # resilience
    fitness += individual[4]  # adaptability
    return fitness,

# Define the genetic programming problem
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

# Initialize the genetic algorithm toolbox
toolbox = base.Toolbox()

# Define the genetic operators
toolbox.register("attribute", random.uniform, 0, 1)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attribute, n=5)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", fitness_function)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=0.1, indpb=0.1)
toolbox.register("select", tools.selTournament, tournsize=3)

# Run the genetic algorithm
population = toolbox.population(n=10)
for generation in range(20):
    offspring = algorithms.varAnd(population, toolbox, cxpb=0.5, mutpb=0.1)
    fits = toolbox.map(toolbox.evaluate, offspring)
    for fit, ind in zip(fits, offspring):
        ind.fitness.values = fit
    population = toolbox.select(offspring, k=len(population))
    
# Print the best individual found
best_individual = tools.selBest(population, k=1)[0]

print("Best Individual:", best_individual)
 

Here, we define the genetic programming parameters (i.e., the traits that we’re optimizing for) using the toolbox.register function. We also define the evaluation function (fitness_function), genetic operators (mate and mutate), and selection operator (select) using DEAP’s built-in functions.

We then define the fitness function that the genetic algorithm will optimize. This function takes an “individual” (represented as a list of five attributes) as input, and calculates the fitness based on how closely the individual’s behavior matches stoic principles.

We then define the genetic programming problem via the quantized attributes, and initialize the genetic algorithm toolbox with the necessary genetic operators.

Finally, we run the genetic algorithm for 20 generations and print the best individual found. The selBest function selects the top individual, a fitness agent or “behavior” if you will, from the final generation. This individual represents an agent that mimics the philosophy of Stoicism in software, with behavior that is self-controlled, rational, focused, resilient, and adaptable.

Best Individual: [0.8150247518866958, 0.9678037028949047, 0.8844195735244268, 0.3970642186025506, 1.2091810770505023]

This denotes the best individual with the best-balanced attributes, or in this case the Most Stoic. (Note that the Gaussian mutation is unbounded, which is why the last attribute has drifted slightly above the initial [0, 1] range; bounding the genes is one obvious refinement.)

As i noted, this is a first attempt at this problem. i think there is a better way, with a full GP solution as well as a tunable fitness function. In a larger distributed system you would then use this agent as a framework amongst the other agents you would define.
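One way a tunable fitness function could look, sketched here as my own assumption rather than anything DEAP prescribes (beyond its convention of returning fitness as a tuple): expose per-trait weights so different emphases of Stoicism can be dialed in.

```python
# Hypothetical tunable fitness: each trait gets a weight instead of all
# traits contributing equally.  Gene order, as in the example above:
# self-control, rationality, focus, resilience, adaptability.
def weighted_fitness(individual, weights):
    # individual and weights are parallel lists of five floats;
    # DEAP expects fitness values as a tuple, hence the trailing comma
    return (sum(w * g for w, g in zip(weights, individual)),)

stoic_weights = [2.0, 2.0, 1.0, 1.0, 1.0]  # emphasize self-control and rationality
ind = [0.8, 0.9, 0.5, 0.5, 0.5]
print(weighted_fitness(ind, stoic_weights))
```

In the DEAP setup above this would slot in via something like toolbox.register("evaluate", weighted_fitness, weights=stoic_weights), since register accepts default arguments for the wrapped function.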

i at least got this out of my head.

until then,

#iwishyouwater <- Alexey Molchanov and Dan Bilzerian at Deep Dive Dubai

Muzak To Blog By: Phil Lynott, “The Philip Lynott Album”. If you don’t know who this is, there is a statue of him in Ireland that i walked a long way to with my co-founder, Lisa Maki, a long time ago to pay homage to the great Irish singer of the amazing band Thin Lizzy. Alas, they had taken Phil away to be cleaned that day. At least we got to walk and talk, and i’ll never forget that day. This is one of his solo efforts, and i believe he is one of the best artists of all time. The first track is deeply emotional.

References:

[1] A list of books on Stoicism -> click HERE.

[2] Genetic Programming (On the Programming of Computers by Means of Natural Selection), by Professor John R. Koza. There are multiple volumes, i think four, and i have all of them, but this is a great place to start, along with the DEAP documentation. Even just optimizing transcendental functions is mind-blowing: what GP comes up with using only arithmetic.

Snake_Byte:[13] The Describe Function.

DALLE-2 Draws Describe

First, i trust everyone is safe. Second, i hope people are recovering somewhat from the SVB situation. We are at the end of an era, cycle, or epoch; take your pick. Third, i felt like picking a Python function that is simple in nature but very helpful.

The function is pandas.DataFrame.describe(). i’ve previously written about other introspection libraries like DABL; however, this one is rather simple and already in place. Actually, i had never utilized it before. i was working on some other code as a hobby in the areas of transfer learning and was playing around with some data, and decided to use the breast cancer data from the sklearn library, which is much like the iris data used for canonical modeling and comparison. Most machine learning is data cleansing and feature selection, so let’s start with something we know.

Breast cancer is the second most common cancer in women worldwide, with an estimated 2.3 million new cases in 2020. Early detection is key to improving survival rates, and machine learning algorithms can aid in diagnosing and treating breast cancer. In this blog, we will explore how to load and analyze the breast cancer dataset using the scikit-learn library in Python.

The breast cancer dataset is included in scikit-learn's datasets module, which contains a variety of well-known datasets for machine learning. The features describe the characteristics of the cell nuclei present in the image. We can load the dataset using the load_breast_cancer function, which returns a dictionary-like object containing the data and metadata about the dataset.

It has been surmised that machine learning is mostly data exploration and data cleaning.

from sklearn.datasets import load_breast_cancer
import pandas as pd

#Load the breast cancer dataset
data = load_breast_cancer()

The data object returned by load_breast_cancer contains the feature data and the target variable. The feature data contains measurements of 30 different features, such as radius, texture, and symmetry, extracted from digitized images of fine needle aspirate (FNA) of breast mass. The target variable is binary, with a value of 0 indicating a malignant tumor and a value of 1 indicating a benign tumor.

We can convert the feature data and target variable into a pandas dataframe using the DataFrame constructor from the pandas library. We also add a column to the dataframe containing the target variable.

#Convert the data to a pandas dataframe
df = pd.DataFrame(data.data, columns=data.feature_names)
df['target'] = pd.Series(data.target)

Finally, we can use the describe method of the pandas dataframe to get a summary of the dataset. The describe method returns a table containing the count, mean, standard deviation, minimum, and maximum values for each feature, as well as the count, mean, standard deviation, minimum, and maximum values for the target variable.

#Use the describe() method to get a summary of the dataset
print(df.describe())

The output of the describe method is as follows:

mean radius  mean texture  ...  worst symmetry      target
count   569.000000    569.000000  ...      569.000000  569.000000
mean     14.127292     19.289649  ...        0.290076    0.627417
std       3.524049      4.301036  ...        0.061867    0.483918
min       6.981000      9.710000  ...        0.156500    0.000000
25%      11.700000     16.170000  ...        0.250400    0.000000
50%      13.370000     18.840000  ...        0.282200    1.000000
75%      15.780000     21.800000  ...        0.317900    1.000000
max      28.110000     39.280000  ...        0.663800    1.000000

[8 rows x 31 columns]

From the summary statistics, we can see that the values of the features vary widely, with the mean radius ranging from 6.981 to 28.11 and the mean texture ranging from 9.71 to 39.28. We can also see that the target variable is roughly balanced, with 62.7% of the tumors being benign (target = 1).
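To see exactly what describe reports, here is a self-contained toy example on a tiny hand-made dataframe (not the cancer data), small enough that the statistics can be checked by eye:

```python
import pandas as pd

# Three observations of one feature plus a binary target
df = pd.DataFrame({"radius": [10.0, 14.0, 18.0], "target": [0, 1, 1]})
summary = df.describe()

# describe returns a dataframe indexed by statistic name,
# so individual values can be pulled out with .loc
print(summary.loc["mean", "radius"])   # -> 14.0
print(summary.loc["count", "target"])  # -> 3.0
```

The same .loc pattern works on the cancer dataframe, e.g. to grab just the mean of one feature out of the full summary table.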

Pretty nice utility.

Then again in looking at this data one would think we could get to first principles engineering and root causes and make it go away? This directly affects motherhood which i still believe is the hardest job in humanity. Makes you wonder where all the money goes?

Until then,

#iwishyouwater <- Free Diver Steph who is also a mom hunting pelagics on #onebreath

Muzak To Blog By: Peter Gabriel’s “Peter Gabriel 3: Melt” (remastered). He is coming out with a new album. Games Without Frontiers and Intruder are timeless. i applied long ago to work at Real World Studios and received the nicest rejection letter.

Execution Is Everything

bulb 2 warez

Even if we crash and burn and lose everything, the experience is worth ten times the cost.

~ S. Jobs

As always, Oh Dear Readers, i trust this finds you safe. Second, to those affected by the SVB situation – Godspeed.

Third, i was inspired to write a blog on “Doing versus Thinking,” and then i decided on the title “Execution Is Everything”. This statement happens to be located at the top of my LinkedIn Profile.

The impetus for this blog came from a recent conversation with an executive who told me, “I made the fundamental mistake of falling in love with the idea and quickly realized that ideas are cheap; it is the team that matters.”

i’ve written about this very issue on several occasions. In Three T’s of a Startup to Elite Computing, i have explicitly stated that ideas are cheap, a dime a dozen. Tim Ferriss, in the amazing book “Tools Of Titans,” interviews James Altucher, who does this exercise every day:

This is taken directly from the book in his words, but condensed for space, here are some examples of the types of lists James makes:

  • 10 old ideas I can make new
  • 10 ridiculous things I would invent (e.g., the smart toilet)
  • 10 books I can write (The Choose Yourself Guide to an Alternative Education, etc).
  • 10 business ideas for Google/Amazon/Twitter/etc.
  • 10 people I can send ideas to
  • 10 podcast ideas or videos I can shoot (e.g., Lunch with James, a video podcast where I just have lunch with people over Skype and we chat)
  • 10 industries where I can remove the middleman
  • 10 things I disagree with that everyone else assumes is religion (college, home ownership, voting, doctors, etc.)
  • 10 ways to take old posts of mine and make books out of them
  • 10 people I want to be friends with (then figure out the first step to contact them)
  • 10 things I learned yesterday
  • 10 things I can do differently today
  • 10 ways I can save time
  • 10 things I learned from X, where X is someone I’ve recently spoken with or read a book by or about. I’ve written posts on this about the Beatles, Mick Jagger, Steve Jobs, Charles Bukowski, the Dalai Lama, Superman, Freakonomics, etc.
  • 10 things I’m interested in getting better at (and then 10 ways I can get better at each one)
  • 10 things I was interested in as a kid that might be fun to explore now (like, maybe I can write that “Son of Dr. Strange” comic I’ve always been planning. And now I need 10 plot ideas.)
  • 10 ways I might try to solve a problem I have. This has saved me with the IRS countless times. Unfortunately, the Department of Motor Vehicles is impervious to my superpowers

Is your brain tired of just “thinking” about doing those gymnastics?

i cannot tell you how many people have come to me and said “hey I have an idea!” Great, so do you and countless others. What is your plan of making it a reality? What is your maniacal passion every day to get this thing off the ground and make money?

The statement “Oh, I/We thought about that 3 years ago” is not a qualifier for anything except the fact that you thought it and didn’t execute on said idea. You know why?

Creating software from an idea that runs 24/7 is still rather difficult. In fact VERY DIFFICULT.

“Oh, We THOUGHT about that <insert number of days or years ago here>.” i call the above commentary “THOUGHTING”. Somehow the THOUGHT is manifested from Ideas2Bank? If that is a process, i’d love to see the burndown chart on that one. No, Oh Dear Readers, THOUGHTING is about as useful as that overly complex PowerPoint that gets edited ad nauseam, where people confuse the “slideware” with the “software”. The only code that matters is this:

Code that is written with the smallest OPEX and the Highest Margins, thereby increasing Revenue Per Employee, unless you choose to put it in open source for a wonderful plethora of reasons, or you are providing a philanthropic service.

When it comes to creating software, “Execution is everything.” gets tossed around just like the phrase “It Just Works” as a requirement. At its core, this phrase means that the ability to bring an idea to life through effective implementation is what separates successful software from failed experiments.

The dynamic range between average and the best is 2:1. In software, it is 50:1, maybe 100:1; very few things in life are like this. I’ve built a lot of my success on finding these truly gifted people.

~ S. Jobs

In order to understand why execution is so critical in software development, it’s helpful first to consider what we mean by “execution.” Simply put, execution refers to the process of taking an idea or concept and turning it into a functional, usable product. This involves everything from coding to testing, debugging to deployment, and ongoing maintenance and improvement.

When we say that execution is everything in software development, what we’re really saying is that the idea behind a piece of software is only as good as the ability of its creators to make it work in the real world. No matter how innovative or promising an idea may seem on paper, it’s ultimately worthless if it can’t be brought to life in a way that users find valuable and useful.

You can fail at something you dislike just as easily as something you like so why not choose what you like?

~ J. Carrey

This is where execution comes in. In order to turn an idea into a successful software product, developers need to be able to navigate a complex web of technical challenges, creative problem-solving, and user feedback. They need to be able to write code that is clean, efficient, and scalable. They need to be able to test that code thoroughly, both before and after deployment. And they need to be able to iterate quickly and respond to user feedback in order to improve and refine the product continually.

The important thing is to dare to dream big, then take action to make it come true.

~ J. Girard

All of these factors require a high degree of skill, discipline, and attention to detail. They also require the ability to work well under pressure, collaborate effectively with other team members, and stay focused on the ultimate goal of creating a successful product.

The importance of execution is perhaps most evident when we consider the many examples of software projects that failed despite having what seemed like strong ideas behind them. From buggy, unreliable apps to complex software systems that never quite delivered on their promises, there are countless examples of software that fell short due to poor execution.

On the other hand, some of the most successful software products in history owe much of their success to strong execution. Whether we’re talking about the user-friendly interface of the iPhone or the robust functionality of PayPal’s payment protocols, these products succeeded not just because of their innovative ideas but because of the skill and dedication of the teams behind them.

The only sin is mediocrity[1].

~ M. Graham

In the end, the lesson is clear: when it comes to software development, execution really is everything. No matter how brilliant your idea may be, it’s the ability to turn that idea into a functional, usable product that ultimately determines whether your software will succeed or fail. By focusing on the fundamentals of coding, testing, and iterating, developers can ensure that their software is executed to the highest possible standard, giving it the best chance of success in an ever-changing digital landscape.

So go take that idea and turn it into a Remarkable Viable Product, not a Minimum Viable Product! Who likes Minimum? (thanks R.D.)

Be Passionate! Go DO! Go Create!

Go Live Your Personal Legend!

A great video stitching together discussions from Steve Jobs on execution and passion: click here -> The Major Thinkers Steve Jobs

Until then,

#iwishyouwater <- yours truly hitting around 31 meters (~100ft) on #onebreath

@tctjr

Muzak To Blog By: Todd Hannigan “Caldwell County.”

[1] “The only sin is mediocrity” is not true; if there were a real Sin, it would be Stupidity, but the quote fits well in the narrative.

Snake_Byte[12]: Dabl A High-Level Data Analysis Library in Python

Not To Be Confused With The Game

It enables us to dabble in vicarious vice and to sit in smug judgment on the result.

Online Quote Generator

First, i hope everyone is safe. Second, i haven’t written a Snake_Byte [ ] in quite some time, so here goes. This is a library i ran across late last night, and for what it achieves, even just for data exploration, it is well worth the pip install dabl cost of it all.

Data analysis is an essential task in the field of machine learning and artificial intelligence. However, it can be a challenging and time-consuming task, especially for those who are not familiar with programming. That’s where the dabl library comes into play.

dabl, short for Data Analysis Baseline Library, is a high-level data analysis library in Python, designed to make data analysis as easy and effortless as possible. It is an open-source library, developed and maintained by members of the scikit-learn community.

The library provides a collection of simple and intuitive functions for exploring, cleaning, transforming, and visualizing data. With dabl, users can perform supervised analysis tasks, most notably classification and regression, with just a few lines of code.

One of the main benefits of dabl is that it helps users get started quickly by providing sensible defaults for each task. For example, to perform a regression analysis, users can simply instantiate dabl’s SimpleRegressor and fit it on their data, and dabl will take care of the rest.

Another advantage of dabl is that it provides easy-to-understand visualizations of the results, allowing users to quickly understand the results of their analysis and make informed decisions based on the data. This is particularly useful for non-technical users who may not be familiar with complex mathematical models or graphs.

dabl also integrates well with other popular data analysis libraries such as pandas, numpy, and matplotlib, making it a convenient tool for those already familiar with these libraries.

So let us jump into the code shall we?

This code uses the dabl library to analyze the Titanic dataset. The dataset is loaded using the pandas library, cleaned with dabl.clean, and then passed to dabl’s SimpleClassifier for analysis (survival is a binary outcome, so this is a classification task rather than a regression). The fit method fits the model to the data, and the dabl.plot function is used to visualize the dataset against the target.

import dabl
import pandas as pd
import matplotlib.pyplot as plt

# Load the Titanic dataset from the disk
titanic = pd.read_csv(dabl.datasets.data_path("titanic.csv"))
# check shape, columns, etc.
titanic.shape
titanic.head()
# all that is good, tons of stuff going on here, but now let us ask dabl what's up:
titanic_clean = dabl.clean(titanic, verbose=1)

# a cool call to detect types
types = dabl.detect_types(titanic_clean)
print(types)
# let's do some eye candy
dabl.plot(titanic, 'survived')
# let's check the distribution
plt.show()
# let us try a simple classifier; if it works, it works
fc = dabl.SimpleClassifier(random_state=0)
X = titanic_clean.drop("survived", axis=1)
y = titanic_clean.survived
fc.fit(X, y)

Ok, so let’s break this down a little.

We load the data set: (make sure the target directory is the same)

# Load the Titanic dataset from the disk
titanic = pd.read_csv(dabl.datasets.data_path("titanic.csv"))

Of note, we loaded this into a pandas dataframe. Assuming we can use Python and load a comma-separated values file, let’s now do some exploration:

# check shape, columns, etc.
titanic.shape
titanic.head()

You should see the following:

(1309, 14) 

Which is [1309 rows x 14 columns]

and then:

pclass  survived                                             name  \
0          1         1                    Allen, Miss. Elisabeth Walton   
1          1         1                   Allison, Master. Hudson Trevor   
2          1         0                     Allison, Miss. Helen Loraine   
3          1         0             Allison, Mr. Hudson Joshua Creighton   
4          1         0  Allison, Mrs. Hudson J C (Bessie Waldo Daniels)   
...      ...       ...                                              ...   
1304       3         0                             Zabour, Miss. Hileni   
1305       3         0                            Zabour, Miss. Thamine   
1306       3         0                        Zakarian, Mr. Mapriededer   
1307       3         0                              Zakarian, Mr. Ortin   
1308       3         0                               Zimmerman, Mr. Leo   

         sex     age  sibsp  parch  ticket      fare    cabin embarked boat  \
0     female      29      0      0   24160  211.3375       B5        S    2   
1       male  0.9167      1      2  113781    151.55  C22 C26        S   11   
2     female       2      1      2  113781    151.55  C22 C26        S    ?   
3       male      30      1      2  113781    151.55  C22 C26        S    ?   
4     female      25      1      2  113781    151.55  C22 C26        S    ?   
...      ...     ...    ...    ...     ...       ...      ...      ...  ...   
1304  female    14.5      1      0    2665   14.4542        ?        C    ?   
1305  female       ?      1      0    2665   14.4542        ?        C    ?   
1306    male    26.5      0      0    2656     7.225        ?        C    ?   
1307    male      27      0      0    2670     7.225        ?        C    ?   
1308    male      29      0      0  315082     7.875        ?        S    ?   

     body                        home.dest  
0       ?                     St Louis, MO  
1       ?  Montreal, PQ / Chesterville, ON  
2       ?  Montreal, PQ / Chesterville, ON  
3     135  Montreal, PQ / Chesterville, ON  
4       ?  Montreal, PQ / Chesterville, ON  
...   ...                              ...  
1304  328                                ?  
1305    ?                                ?  
1306  304                                ?  
1307    ?                                ?  
1308    ?                                ?  

Wow, tons of stuff going on here, and really this is cool data from an awful disaster. Ok, let’s let dabl exercise some muscle here and ask it to clean things up a bit:

titanic_clean = dabl.clean(titanic, verbose=1)
types = dabl.detect_types(titanic_clean)
print (types)

i set verbose=1 in this case, and dabl.detect_types() shows the types detected, which i found helpful:

Detected feature types:
continuous      0
dirty_float     3
low_card_int    2
categorical     5
date            0
free_string     4
useless         0
dtype: int64

However, look what dabl did for us:

                      continuous  dirty_float  low_card_int  categorical  \
pclass                     False        False         False         True   
survived                   False        False         False         True   
name                       False        False         False        False   
sex                        False        False         False         True   
sibsp                      False        False          True        False   
parch                      False        False          True        False   
ticket                     False        False         False        False   
cabin                      False        False         False        False   
embarked                   False        False         False         True   
boat                       False        False         False         True   
home.dest                  False        False         False        False   
age_?                      False        False         False         True   
age_dabl_continuous         True        False         False        False   
fare_?                     False        False         False        False   
fare_dabl_continuous        True        False         False        False   
body_?                     False        False         False         True   
body_dabl_continuous        True        False         False        False   

                       date  free_string  useless  
pclass                False        False    False  
survived              False        False    False  
name                  False         True    False  
sex                   False        False    False  
sibsp                 False        False    False  
parch                 False        False    False  
ticket                False         True    False  
cabin                 False         True    False  
embarked              False        False    False  
boat                  False        False    False  
home.dest             False         True    False  
age_?                 False        False    False  
age_dabl_continuous   False        False    False  
fare_?                False        False     True  
fare_dabl_continuous  False        False    False  
body_?                False        False    False  
body_dabl_continuous  False        False    False 
Target looks like classification
Linear Discriminant Analysis training set score: 0.578
 

Ah sweet! So data science, machine learning, or data mining is 80% cleaning up the data. Take what you can get and go with it, folks. dabl even informs us that the target looks like a classification problem. As the name suggests, classification means classifying the data on some grounds. It is a type of supervised learning. In classification, the target column should be a categorical column. If the target has only two categories, like the one in the dataset above (survived or not), it's called a binary classification problem. When there are more than two categories, it's a multi-class classification problem. The "target" column is also called a "class" in the classification problem.
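As a quick illustrative sketch (using a made-up target column, not the actual Titanic one), you can tell binary from multi-class by simply counting the unique categories in the target:

```python
import pandas as pd

# hypothetical target column, for illustration only
target = pd.Series(["survived", "died", "survived", "died", "survived"])

n_classes = target.nunique()
if n_classes == 2:
    problem = "binary classification"
elif n_classes > 2:
    problem = "multi-class classification"
else:
    problem = "degenerate (single class)"

print(n_classes, problem)  # 2 binary classification
```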

Now let's do some analysis. Yep, we are just getting to some statistics. There are univariate and bivariate analyses; in this case we get the bivariate kind.

Bivariate analysis is the simultaneous analysis of two variables. It explores the relationship between two variables: whether there exists an association and the strength of that association, or whether there are differences between the two variables and the significance of those differences.

The main three types we will see here are:

  1. Categorical vs. Numerical
  2. Numerical vs. Numerical
  3. Categorical vs. Categorical

Also of note Linear Discriminant Analysis or LDA is a dimensionality reduction technique. It is used as a pre-processing step in machine learning. The goal of LDA is to project the features in higher dimensional space onto a lower-dimensional space in order to avoid the curse of dimensionality and also reduce resources and dimensional costs. The original technique was developed in the year 1936 by Ronald A. Fisher and was named Linear Discriminant or Fisher’s Discriminant Analysis. 

(NOTE: there is another LDA (Latent Dirichlet Allocation), used in semantic engineering, that is quite different.)

dabl.plot(titanic, target_col='survived')

What auto-magically happens in the following plots is a set of continuous feature plots for discriminant analysis.

Continuous Feature PairPlots

In the plots you will also see PCA (Principal Component Analysis). PCA was invented in 1901 by Karl Pearson, as an analog of the principal axis theorem in mechanics; it was later independently developed and named by Harold Hotelling in the 1930s. Depending on the field of application, it is also named the discrete Karhunen–Loève transform (KLT) in signal processing, the Hotelling transform in multivariate quality control, and proper orthogonal decomposition (POD) in mechanical engineering. PCA is used extensively in many fields, and my first usage of it was in 1993 for three-dimensional rendering of sound.

Discriminating PCA Directions

What is old is new again.

The main difference is that Linear Discriminant Analysis is a supervised dimensionality reduction technique that also achieves classification of the data simultaneously. LDA focuses on finding a feature subspace that maximizes the separability between the groups. Principal Component Analysis, on the other hand, is an unsupervised dimensionality reduction technique: it ignores the class label and focuses on capturing the direction of maximum variation in the data set.

LDA

Both reduce the dimensionality of the dataset and make it more computationally resourceful. LDA and PCA both form a new set of components.
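The supervised/unsupervised distinction is easy to see in code. Here is a minimal sketch with scikit-learn (using the bundled iris data rather than the Titanic set): PCA's fit never sees the labels, while LDA's fit requires them.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA: unsupervised -- fit() only ever sees the features X
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)

# LDA: supervised -- fit() needs the class labels y
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)

print(X_pca.shape, X_lda.shape)  # (150, 2) (150, 2)
```

Both produce a new, smaller set of components; they just optimize for different things (variance vs. class separability).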

The last plot is categorical versus target.

So now let's try, as dabl suggested, a SimpleClassifier and then fit the data. (hey, some machine learning!)

fc = dabl.SimpleClassifier(random_state=0)
X = titanic_clean.drop("survived", axis=1)
y = titanic_clean.survived
fc.fit(X, y) 

This should produce the following outputs with accuracy metrics:

Running DummyClassifier(random_state=0)
accuracy: 0.618 average_precision: 0.382 roc_auc: 0.500 recall_macro: 0.500 f1_macro: 0.382
=== new best DummyClassifier(random_state=0) (using recall_macro):
accuracy: 0.618 average_precision: 0.382 roc_auc: 0.500 recall_macro: 0.500 f1_macro: 0.382

Running GaussianNB()
accuracy: 0.970 average_precision: 0.975 roc_auc: 0.984 recall_macro: 0.964 f1_macro: 0.968
=== new best GaussianNB() (using recall_macro):
accuracy: 0.970 average_precision: 0.975 roc_auc: 0.984 recall_macro: 0.964 f1_macro: 0.968

Running MultinomialNB()
accuracy: 0.964 average_precision: 0.988 roc_auc: 0.990 recall_macro: 0.956 f1_macro: 0.961
Running DecisionTreeClassifier(class_weight='balanced', max_depth=1, random_state=0)
accuracy: 0.976 average_precision: 0.954 roc_auc: 0.971 recall_macro: 0.971 f1_macro: 0.974
=== new best DecisionTreeClassifier(class_weight='balanced', max_depth=1, random_state=0) (using recall_macro):
accuracy: 0.976 average_precision: 0.954 roc_auc: 0.971 recall_macro: 0.971 f1_macro: 0.974

Running DecisionTreeClassifier(class_weight='balanced', max_depth=5, random_state=0)
accuracy: 0.969 average_precision: 0.965 roc_auc: 0.983 recall_macro: 0.965 f1_macro: 0.967
Running DecisionTreeClassifier(class_weight='balanced', min_impurity_decrease=0.01,
                       random_state=0)
accuracy: 0.976 average_precision: 0.954 roc_auc: 0.971 recall_macro: 0.971 f1_macro: 0.974
Running LogisticRegression(C=0.1, class_weight='balanced', max_iter=1000,
                   random_state=0)
accuracy: 0.974 average_precision: 0.991 roc_auc: 0.993 recall_macro: 0.970 f1_macro: 0.972
Running LogisticRegression(C=1, class_weight='balanced', max_iter=1000, random_state=0)
accuracy: 0.975 average_precision: 0.991 roc_auc: 0.994 recall_macro: 0.971 f1_macro: 0.973

Best model:
DecisionTreeClassifier(class_weight='balanced', max_depth=1, random_state=0)
Best Scores:
accuracy: 0.976 average_precision: 0.954 roc_auc: 0.971 recall_macro: 0.971 f1_macro: 0.974

This actually calls the sklearn routines in aggregate. Looks like a humble depth-1 decision tree wins the day on recall_macro, with our old friend logistic regression a close runner-up. keep it simple sam it ain't gotta be complicated.

In conclusion, dabl is a highly recommended library for those looking to simplify their data analysis tasks. With its intuitive functions and visualizations, it provides a quick and easy way to perform data analysis, making it an ideal tool for both technical and non-technical users. Again, the real strength of dabl is in providing simple interfaces for data exploration. For more information:

dabl github. <- click here

Until Then,

#iwishyouwater <- hold your breath on a dive with my comrade at arms @corepaddleboards. great video and the clarity was astounding.

Muzak To Blog By: “Ballads For Two”, Chet Baker and Wolfgang Lackerschmid; trumpet meets vibraphone sparsity. The space between the notes is where all of the action lives.

A Book Review – Scythe

Nice Robe

I am the blade that is swung by your hand,

Slicing a rainbow’s arc,

I am the clapper; but you are the bell,

Tolling the gathering dark.

If you are the singer, then I am the song,

A threnody, requiem dirge.

You’ve made me the answer for all the world’s need,

Humanity’s undying urge.

~ “Threnody,” from the collected works of H.S. Socrates

First as always i hope everyone is safe. Second, i hope everyone had an indulgent and superlative holiday season heading into the New Year! Third, i decided i wanted to write a book review, since i haven’t in quite some time, and given the number of books i have read recently, i figured hey, let’s do a book review!

This review deserves a little context. My middle progeny was assigned this book for a winter-break reading assignment. i believe it is important to take an interest in your progenies’ activities, and well, reading is definitely one to promote and take interest in. Thus, when i was talking to her about what she was assigned, she said, “i think you will like this book.” She handed it to me, and i opened it to this page:

“It is the most difficult thing a person can be asked to do. And know what it is for the greater good doesn’t make it any easier. People used to die naturally. Old age used to be a terminal affliction, not a temporary state. There were invisible killers called “diseases” that broke the body down. Aging couldn’t be reversed, and there were accidents from which there was no return. Planes fell out from the sky. Cars actually crashed. There was pain, misery, and despair. It’s hard for most of us to imagine a world so unsafe, with dangers lurking in every unseen, unplanned corner. All of that is behind us now, and yet a single simple truth remains: People must die.”

~ From the gleaning journal of H.S. Currie

My daughter knows me well. So i said ok, let us read it together, but don’t think this is a race; i’ll probably have this book finished in a week.

Ok, this had my attention. So immediately i thought of Soylent Green meets Logan’s Run, but there is a twist. The back cover paraphrases a world with no hunger, no disease, no war, no misery. Humanity has conquered all those things and has even conquered death. However, who are these Scythes that are mentioned, and why are they the only ones who can “glean” life? So, with the context out of the way, let us get down to business.

The cover above depicts a scythe. As most know, or should know, from Webster’s we have the following definition:

scythe (pronounced /sīT͟H/)

noun: a tool used for cutting crops such as grass or wheat, with a long curved blade at the end of a long pole attached to which are one or two short handles.

verb: cut with a scythe as in scythed.

Given many aspects of our so-called society today and social normalizations i believe this is a wonderful teenage adventure novel that sets the stage for some more esoteric readings in science fiction such as 1984, Something Wicked This Way Comes, Brave New World, Do Androids Dream Of Electric Sheep (DADOES), Snowcrash and Neuromancer.

The book’s premise is that humans now exist in a conflict-free world where humankind has conquered death. The world they live in is a post-Age-of-Mortality world where one no longer has true crimes against humanity; poverty is not an issue, and hunger is solved via synthetic food engineering. As such, over-population has overrun Mother Earth, and elected Scythes must cull the human population. This culling process is known in the vernacular as “gleaning”. The Age of Mortality is the duration of time before the scythedom, revival centers, and the Thunderhead were established.

To this end, artificial intelligence has been amplified via the “ThunderHead” which monitors, recommends, and predicts AllTheThings. As such, there is no need for the concept or construction of a government.

Tyger shrugged, “One Splat Too Many. They gave up. Now I am a ward of the ThunderHead.”

“I’m sorry Tyger”

“Hey don’t be. Believe it or not, the ThunderHead’s a better father than my father was. I get good advice now and get asked how my day was from someone who actually seems to care.”

Just like everything else about the ThunderHead, its parenting skills were indisputable.

~ Apprentice Rowan

Two teens find themselves volunteered as apprentice Scythes, which leads them into a world of corruption, greed, and the finality of death.

Apprentice Scythes are taught all of the classics: philosophy, chemistry (poisons), neuro-linguistic programming (person-reading), and, of course, all the ways one can end a person’s life — skills in extreme social engagement, if you will, called “killcraft”.

Scythes choose which lives to glean based on statistics of past Age of Mortality morbidity rates and behaviors, social class, and ethnicities. They, however, cannot show bias.

Scythes are ruled by a worldwide committee and meet on a quarterly basis, where concerns are raised, apprentices are tested, and old friendships are renewed. Did i mention that humans now live indefinitely and can rewind their physical age and appearance to no lower than 21? Given that, most Scythes choose ages between 35 and 45.

The Scythes lived by the following commandments:

  1. Thou Shalt Kill
  2. Thou shalt kill with no bias, bigotry, or malice aforethought
  3. Thou shalt grant an annum of immunity to the beloved of those who accept your coming and to anyone else you deem worthy.
  4. Thou shalt kill the beloved of those who resist.
  5. Thou shalt serve humanity for the full span of thy days and thy family shall have immunity as recompense for as long as you live.
  6. Thou shalt lead an exemplary life in word and deed and keep a journal of each and every day.
  7. Thou shalt kill no scythe beyond thyself.
  8. Thou shalt claim no earthly possessions save thy robe, ring and journal.
  9. Thou shalt have neither spouse nor spawn.
  10. Thou shalt be beholden to no laws beyond these.

So we must ask ourselves: if in fact we solve all the so-called woes of the Human Condition, will we solve the root cause of the Human Condition? If we take away mortality (and morality) and can save and upload our memories, then what does it mean to be Human? Passion and lust (of life)? Is compassion still needed?

Upon giving me the book to read my daughter laughed and said “Daddy maybe you are one.”

Then again, reflecting on what my daughter said to me when she was referencing the text: at the core, maybe we all are Scythes.

So if you’re in the market for a good book for your children, or you just want a quick read that will be a good catalyst for your thoughts about our future, pick this book up. Here, i will even provide the link to the ThunderHead Book Club In The Sky. Note: this is book one of a trilogy.

Until Then,

#iwishyouwater <- some footage from the recent 50 year storm on the left coast.

@tctjr

Muzak To Blog By: Tchaikovsky’s Symphony No. 6 in B minor, Valery Abisalovich Gergiev conducting the Vienna Philharmonic. Spectacular piece, Symphony No. 6 in B minor, Op. 74, also known as the Pathétique Symphony (a.k.a. The Passionate Symphony). I recently got to see this performed by the Charleston Symphony with Jonathan Heyward conducting; it was spectacular. I was sitting there thinking how someone whose sexual proclivities or other passions were not tolerated by the society of the time could create such a work of art; then again, it just goes to show the extreme lengths humans will go to make their true passions, as it were, incarnate. i also think it very ironic that this conductor would probably not now be in a position to perform Tchaikovsky due to his political beliefs, but one never knows, does one?

Look Up Down All Around!

Your Brain 3D Printed [1]

The effects of technology do not occur at the level of opinions or concepts. Rather they alter patterns of perception steadily and without any resistance.

~ Marshall McLuhan

First i hope everyone is safe. Second, this blog is more meta-physical in nature. The above picture is a present i received from a dear friend who 3D printed it for me. A transhumanist pictorial if you will for accelerating our wetware. This brings us to the current matter at hand.

i was traveling recently, and i couldn’t help but notice how many humans are just sitting, walking, running, and even biking while looking at their mobile devices. Families no longer talk to each other; couples no longer kiss. Kids no longer daydream. All no longer LOOK UP, DOWN and ALL AROUND.

i must confess at this juncture that, as a technologist, i am conflicted. As they say, we make the guns, but we don’t pull the trigger. As a technologist, i truly love using and creating with mathematics, hardware, and software. It is an honor as far as i am concerned, and i treat it as such. Yet when i have time to sit and ponder, i think of the time i held the first telegraph message in my hands. Yes, the FIRST telegraph message, which read:

What hath God wrought!

Invented and sent by Samuel Finley Breese Morse 24 May 1844. I held it. Of course it was behind plexiglass, and this is a link to said telegraph.

Why is this important? It converted numbers (Morse code in this case) into a readable document — content, if you will. Even if you do not believe in higher-order deities or some theistic aspects, the message transmitted and received via the telegraph was multi-modal and carried some weight.

There seems to be a trend toward a kind of primitive outlook on life, a more tribal attitude, and i think it’s a natural reaction to industrialization. Unfortunately i think it is a bit naive, because the future is going to become more mechanized, computerized as you call it, and i don’t think there is any turning back.

~ Jim Morrison

Intelligence, it seems, is now but a search engine away or, if you will, a “tic-tok” away. It also seems, due to this immediate gratification of content and information, that we no longer talk to anyone. “The Pandemic” seems to have modified several aspects of our existence. The results of this, i believe, will take decades of evolution before the change is truly understood from a systems theory and first-principles engineering view.

We have been sequestered into a living environment tethered to the LazyWeb(TM). Per my commentary about seeing families with their heads buried in their phones during all modes of so-called social engagement, this is creating considerable fractures in how we deal with friends, families, and most importantly ourselves.

Now in recent times, Humans are going into the office or “back to the hybrid workplace” and taking a zoom call in the adjacent meeting room to where the REAL PHYSICAL meeting is occurring. So the more i pondered, the more i thought i would post a bunch of pictures and talk about cyberspace vs real space.

Live Oak with Sunshine

i have read all the books: Neuromancer, Cyberspace, Snow Crash, Do Androids Dream of Electric Sheep (DADOES), Superintelligence, 1984, Brave New World, Realware, etc. i first worked on full virtual reality applications in 1993. Yes, there were computers back then, big red ones called Silicon Graphics Crimson machines. These, augmented with fixed-point digital signal processing equipment, created the first six-degrees-of-freedom (6DOF) head-tracked stereoscopic renderings, complete with spatial audio. So it is nothing new, just executed in a different fashion.

i recently went to the NASA Astronaut Training Experience at the Kennedy Space Center with my eldest daughter, and we took a walk on Mars and did some trivial tasks. It was a tethered environment with mono-based audio; however, it was impressive from a simulation standpoint. When the alert system informed me that a sandstorm was coming, i was nonplussed. Having worked on top-secret systems, i understand the need for simulations entirely. Simulate, over and over again, all the emergencies you can think of when going into an environment of conflict.

Double Rainbow

On a regular basis, “Humans being” and living do not constitute simulation; and even if you buy into Bostrom’s theory that we are living in a simulation, then what of it? Please make the most of IT. Talk to that person across from you. What color do they love? What is their favorite food? Do they like puppies? If they are close friends and family, above all, show them how you feel. Hug them.

I believe that computers have taken over the world. I believe that they have in many ways ruined our children. I believe that kids used to love to go out and play. I believe that social graces are gone because manners are gone because all people do is sit around and text. I think it’s obnoxious.

~ Stevie Nicks
Sunset and Oak Tree

If you are not the talkative type, go outside and build a fire. Walk through the city. Go sit under a tree. If you live in a place where you can see the sky, go outside, just stare at the sky, and let your eyes adjust. The stars will come out; think about the fact that you are made of the same substances.

Reflect on and into yourself. Shut down all the noise and chatter. Listen. What do YOU hear?

I can’t fax you my love.
I can’t e-mail you my heart.
I can’t see your face in cyberspace,
I don’t know where to start.

~ Jimmy Buffett
Full Moon At Night

When you get up in the morning, don’t start the Doom Scroll. Contemplate. Get a notebook and write some thoughts. The visceral act of writing activates differing neural patterns that allow us to remember and learn. Think about what you would like to accomplish. Hopefully, you made your bed. That is at least one thing you can check off that you did accomplish, and your parents would be proud.

i wrote a blog a while ago called Its An Honor To Say Goodbye. Many seemed to enjoy it for several different reasons. As you look up from your phone at the folks around you, play a game: what if that person just disappeared, as though they were shot by a BFG (Big F-in Gun) in one of the first-person shooters and could not re-frag? Just gone from the simulation? Poof!

How would you feel?

Purple Beach Blue Night Sky

i’ll have to say, if this is a simulation, it is pretty good and has to be some quantum information-theoretic manifestation[2]. Yet! Feeling that embrace from a friend or loved one, feeling the spray from a wave, smelling and touching a rose, a dog licking you in the face, tasting that steak: the carnality and sensuality of it all transcend, at least for me, the “meta” aspects of the online experience.

Go Outside! The Graphics are Great!

~ Sensai Todd
Turquoise Beach Storm

So folks, when in doubt, put that device down for a bit. Go for a walk. Say hello to that person across the room and ask how the day is going, and mean it and listen. Go outside and sit against a tree at night, or take a walk near the ocean or body of water (my favorite). Draw. Shut your eyes and deeply listen to music. Dance. Make stupid sounds. Try something you have never done before. Do something besides being fed programmed content.

Look UP DOWN and ALL AROUND.

So question for all of you:

Q: Would you prefer a telegraph, facsimile or simulation of this life?

TV The Zero Day Virus

Until Then,

tctjr

#iwishyouwater <- Nathan Florence on a hellish Scottish slab paddle-out. He ain’t worried about who clicked like….

Muzak To Blog By: Forestt, “Into The Woods”. i would classify this as martial folk, if i may use genre classification liberally.

[1] Someone i really respect technically and now consider a dear friend printed this out for me. He also prints body parts. Heavy stuff. He is a practicing ER doctor and also codes.

[2] On the above commentary concerning simulations, i do believe in the Minkowski multi-verse theory and view of The Universe. It’s all happening NOW with multiple probabilities; our noggin can’t sample fast enough to reconstruct all of the information simultaneously. Also, remember, girls and boys, YOU are the universe.

[3] i took all of the pictures included herewith except the last one.

references:

[1] this is a great interview with The Lizard King (aka Jim Morrison) when he was 26 in 1970. Listen. This isn’t hippie stuff. Click HERE.

Snake_Byte[11] Linear Algebra, Matrices and Products – Oh My!

Algebra is the metaphysics of arithmetic.

~ John Ray
Looks Hard.

First, as always, i hope everyone is safe. Second, as i mentioned in my last Snake_Byte [], let us do something a little more technical and scientific. For context, the catalyst for this was a surprising discussion that came from how current machine learning interviews are being conducted and how the basics of the distance between two vectors have been overlooked. So this is a basic example, and in the following Snake_Byte [] i promise to get into something a little more, say, carnivore.

With that, let us move to some linear algebra. For those that don’t know what linear algebra is, i will refer you to the best book on the subject, Professor Gilbert Strang’s Linear Algebra and Its Applications.

i am biased here; however, i do believe the two most important areas of machine learning and data science are linear algebra and probability, with optimization techniques coming in a close third.

So dear reader, please bear with me here. We will review a little math; maybe for some, this will be new, and for those that already know this, you can rest your glass-balls.

We denote by x\in\mathbb{R}^N an N-dimensional vector taking real numbers as its entries. For example:

\begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}

where \{a_i\} are the entries indexed by i; in this case N = 3.

An M-by-N matrix is denoted as X\in\mathbb{R}^{M \times N}. The transpose of a matrix is denoted as X^T. A matrix X can be viewed according to its columns and its rows:

\begin{bmatrix}  0 & 1 & 2 \\ 3 & 4 & 5\\ 6 & 7 & 8 \\ \end{bmatrix}

where \{a_{ij}\} are the entries indexed by row i and column j.

An array is a data structure in Python that holds a fixed number of elements, all of the same data type. The main idea behind using an array is storing multiple elements of the same type. Most data structures make use of an array to implement their algorithms. There are two important parts of an array:

  • Element: Each item stored in the array is called an element.
  • Index: Every element in the array has its own numerical value to identify the element.

Think of programming a loop, tuple, list,array,range or matrix:

from math import exp

# x, y, x1, x2, x3 are assumed to be defined elsewhere
v1 = [x, y]         # list of variables
v2 = (-1, 2)        # tuple of numbers
v3 = (x1, x2, x3)   # tuple of variables

v4 = [exp(-i*0.1) for i in range(150)]  # ye ole range loop

and check this out for a matrix:

import numpy as np
a = np.matrix('0 1; 2 3')  # note: rows are separated by semicolons
print(a)
output: [[0 1]
 [2 3]]

which, folks, is why we like the Snake Language. Really, that is about it for vectors and matrices. The theory is where you get into proofs and derivations, which can save you a ton of time on optimizations.

So now let’s double click on some things that will make you sound cool at the parties or meetups.

A vector can be multiplied by a number. This number a is usually denoted as a scalar:

a\cdot (v_1,v_2) = (av_1,av_2)

Now given this, one of the most fundamental concepts in all of machine learning is the inner product — also called the dot product, or scalar product — of two vectors, which is a number. Almost all machine learning algorithms have some form of dot product somewhere within the depths of all the mathz. Nvidia GPUs are optimized for (you guessed it) dot products.

So how do we set this up? Multiplication of a scalar a and a vector (v_0,\dots,v_{n-1}) yields:

(av_0,\dots,av_{n-1})

Ok good so far.

The inner or dot product of two n-vectors is defined as:

(u_0,\dots,u_{n-1})\cdot(v_0,\dots,v_{n-1}) = u_0v_0 + \cdots + u_{n-1}v_{n-1}

which, if you are paying attention, yields:

(1)   \begin{equation*} = \sum_{j=0}^{N-1}{u_jv_j}\end{equation*}

Geometrically, the dot product of U and V equals the length of U times the length of V times the cosine of the angle between them:

\textbf{U}\cdot\textbf{V}=|\textbf{U}||\textbf{V}|\cos\theta

ok so big deal huh? yea, but check this out in the Snake_Language:

# dot product of two vectors
 
# Importing numpy module
import numpy as np
 
# Taking two scalar values
a = 5
b = 7
 
# Calculating dot product using dot()
print(np.dot(a, b))
output: 35

hey now!

# Importing numpy module
import numpy as np
 
# Taking two 2D array
# For 2-D arrays it is the matrix product
a = [[2, 1], [0, 3]]
b = [[1, 1], [3, 2]]
 
# Calculating dot product using dot()
print(np.dot(a, b))
output:[[5 4]
       [9 6]]

Mathematically speaking, the inner product is a generalization of the dot product. As we said, constructing a vector is done using the command np.array. Inside this command, one needs to enter the array. For a column vector, we write [[1],[2],[3]], with an outer [] and three inner []s, one for each entry. For a row vector, one can omit the inner []s by just calling np.array([1, 2, 3]).

Given two column vectors x and y, the inner product is computed via np.dot(x.T,y), where np.dot is the command for inner product, and x.T returns the transpose of x. One can also call np.transpose(x), which is the same as x.T.

# Python code to perform an inner product with transposition
import numpy as np

x = np.array([[1], [0], [-1]])
y = np.array([[3], [2], [0]])
z = np.dot(np.transpose(x), y)
print(z)  # [[3]]


Yes, dear reader, you now can impress your friends with your linear algebra and Python prowess.

Note: the raw dot product is scale dependent; for actual purposes of real computation, you must do something called taking the norm of a vector, which makes comparisons scale independent. i won’t go into the mechanics of this unless asked for further explanations on the mechanics of linear algebra. i will gladly go into pythonic examples if so asked and will be happy to write about said subject. Feel free to inquire in the comments below.
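For the impatient, here is a short sketch of the norm and the cosine similarity it enables; dividing the dot product by the norms removes the effect of vector length, leaving only the angle between the two vectors:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([6.0, 8.0])  # same direction as u, twice the length

# Euclidean (L2) norm: the square root of a vector's dot product with itself
norm_u = np.linalg.norm(u)  # 5.0
norm_v = np.linalg.norm(v)  # 10.0

# cosine of the angle between them: dot product divided by the norms
cos_theta = np.dot(u, v) / (norm_u * norm_v)
print(norm_u, norm_v, cos_theta)  # 5.0 10.0 1.0
```

Because u and v point in the same direction, the cosine is exactly 1 even though their lengths differ — that is the scale independence the norm buys you.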

Until Then,

#iwishyouwater <- Nathan Florence with Kelly Slater at the Box. Watch.

tctjr.

Muzak to Blog By: INXS. i had forgotten how good of a band they were and how deep the catalog goes. Michael Hutchence, the lead singer, hanged himself in a hotel room. Check out the songs “By My Side”, “Don’t Change”, “Never Tear Us Apart” and “To Look At You”. They weren’t afraid to take production chances.

Note[2]: i resurrected some very old content from a previous site i owned and imported the older blogs. Some hilarious. Some sad. Some infuriating. i’m shining them up. Feel free to look back in time.

Snake_Byte[10] – Module Packages

Complexity control is the central problem of writing software in the real world.

Eric S. Raymond
AI-Generated Software Architecture Diagram

Hello dear readers! First, i hope everyone is safe. Secondly, it is the Monday-iest WEDNESDAY ever! Ergo it’s time for a Snake_Byte!

Grabbing a tome off the bookshelf, we randomly open it, and the subject matter today is module packages. So there will not be much, if any, code, but more discussion, as it were, of the explanations thereof.

Module imports are the mainstay of the snake language.

A Python module is a file that has a .py extension, and a Python package is any folder that has modules inside it (or, if you’re still in Python 2, a folder that contains an __init__.py file).

What happens when you have code in one module that needs to access code in another module or package? You import it!

In Python, a directory is said to be a package; thus these imports are known as package imports. What happens in an import is that the dotted name is mapped to a directory path on your local machine (your come-pooter) or on that cloud thing everyone talks about these days.

It turns out that hierarchy simplifies the complexities of organizing files and trends toward simplifying search path settings.

Absolute imports are preferred because they are direct. It is easy to tell exactly where the imported resource is located and what it is just by looking at the statement. Additionally, absolute imports remain valid even if the current location of the import statement changes. In addition, PEP 8 explicitly recommends absolute imports. However, sometimes they get so complicated you want to use relative imports.

So how do imports work?

import dir1.dir2.mod
from dir1.dir2.mod import x

Note the “dotted path” in these statements is assumed to correspond to a path through the directory tree on the machine you are developing on; in this case it leads to mod.py: directory dir1 contains subdirectory dir2, which contains the module mod.py. Historically, the dotted-path syntax was created for platform neutrality, and from a technical standpoint, paths in import statements become object paths.

In general the leftmost name in the dotted path must be found on the module search path (unless it is a top-level file in the home directory); each name to its right is simply a subdirectory or module inside the one before it.
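As a sketch of how the dotted path maps onto directories, the following builds the dir1/dir2/mod layout from above in a temporary directory (the tempfile scaffolding and the variable x = 42 are my own additions for illustration) and shows that once the container of dir1 is on sys.path, the dotted import just works:

```python
import os
import sys
import tempfile

# Build the dir1/dir2/mod.py layout the dotted path refers to.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "dir1", "dir2")
os.makedirs(pkg)

# __init__.py files mark dir1 and dir2 as regular packages.
open(os.path.join(root, "dir1", "__init__.py"), "w").close()
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mod.py"), "w") as f:
    f.write("x = 42\n")

# The leftmost name (dir1) is found via sys.path; the rest follow the dots.
sys.path.insert(0, root)
from dir1.dir2.mod import x
print(x)  # 42
```

The only thing Python needed from us was the location of the leftmost directory; everything to the right of the first dot was resolved relative to it.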

In Python 3.x package imports changed slightly, and the changes only apply to imports within files located in package directories. The changes include:

  • Modification of the module import search path semantics to skip the package’s own directory by default. Such imports are essentially absolute imports.
  • Extension of the syntax of from statements to allow them to explicitly request that imports search the package’s directory only. This is the relative import mentioned above.

so for instance:

from . import spam  # relative to this package

Instructs Python to import a module named spam located in the same package directory as the file in which this statement appears.

Similarly:

from .spam import name

states: from a module named spam located in the same package as the file that contains this statement, import the variable name.

Something to remember is that an import without a leading dot always causes Python to skip the relative components of the module import search path and look instead in the absolute directories that sys.path contains. You can only use the leading-dot nomenclature for relative imports with the from statement.
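To make the spam example above concrete, here is a minimal sketch (the package name pkg and the value 'eggs' are hypothetical names i chose for illustration) that writes a small package to a temporary directory, whose __init__.py uses both relative-import forms:

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "pkg")
os.makedirs(pkg_dir)

# spam.py lives inside the package and defines a variable called name.
with open(os.path.join(pkg_dir, "spam.py"), "w") as f:
    f.write("name = 'eggs'\n")

# The package's __init__.py uses both relative-import forms from the text.
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from . import spam        # the module itself\n"
            "from .spam import name    # a variable inside it\n")

sys.path.insert(0, root)
import pkg  # triggers the relative imports inside __init__.py
print(pkg.name)       # eggs
print(pkg.spam.name)  # eggs
```

Both forms resolve relative to the package directory containing __init__.py, never to whatever directory you happen to be running from.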

Packages are standard now in Python 3.x. It is now very common to see very large third-party extensions deployed as sets of package directories rather than flat lists of modules. Also, caveat emptor: importing only the names you need can save memory. Read the documentation. Many times importing AllTheThings results in major memory usage, an issue when you are going to production with highly optimized Python.

There is much more to this import stuff. Transitive Module Reloads, Managing other programs with Modules (meta-programming), Data Hiding etc. i urge you to go into the LazyWebTM and poke around.

in addition a very timely post:

PyPI is running a survey on packages:

Take the survey here -> PyPI Survey on Packages

Here are some great comments and suggestions via Y-Combinator News:

Y-Combinator News Commentary on PyPI Packages

That is all for now. i think next time we are going to delve into some more scientific or mathematical snake language bytes.

Until Then,

#iwishyouwater <- Wedge top 50 wipeouts. Smoookifications!

@tctjr

MUZAK TO BLOG BY: NIN – “The Downward Spiral (Deluxe Edition)”. A truly phenomenal piece of work. NIN’s second album; Trent Reznor told Jimmy Iovine upon delivering the concept album, “I’m Sorry I had to…”. In 1992, Reznor moved to 10050 Cielo Drive in Benedict Canyon, Los Angeles, where actress Sharon Tate formerly lived and where he made the record. i believe it changed the entire concept of music and created a new genre. From an engineering point of view, Digidesign‘s TurboSynth and Pro Tools were used extensively.

It Is An Honor To Say “GoodBye”.

No one ever told me that grief felt so like fear.

C.S. Lewis
An AI-Generated Image

First, i hope everyone is safe, especially on this day when belief systems ran completely amok. Second, this day also holds a place for me that i will not go into but if you are a good internet sleuth you can figure it out.

Today i did something i have never done nor did i think i could do because of several factors. However, into the breach once more, and lo and behold i pulled it off. The man with me is an expert at this activity and gave me some pointers as to how to perform said activity. As i was saying goodbye to the man, who is one of the closest people in my life, we volitionally hugged each other and shook hands a certain way.

On this day i reflected on an Uber ride that i had years ago where a man picked me up. We started talking as it was a pretty good drive from SFO to the Marines’ Memorial Club & Hotel where i was speaking.

There are places I’ll remember
All my life though some have changed
Some forever, not for better
Some have gone and some remain

All these places have their moments
With lovers and friends I still can recall
Some are dead and some are living
In my life I’ve loved them all

The driver, as it turns out, was a former senior salesperson at Salesforce. As i always say, you never know what someone has been through, so don’t judge them by how they make a living. We discussed most of the “-isms” and then he said, “Mr Ted i found comfort in the Christian Bible. Have you read it?” i said i have read it three times and i prefer the Old Testament. i asked him why. He said it helped him through the hard times of his life. He was talking about his family in the past tense and i was very sensitive to prying too much into his business. i asked him what type of hardships. He said his family lived during the years of Pol Pot and the Cambodian genocide and his family were all murdered. i really didn’t know what to say except “My Condolences”. He said, “Thank you Mr Ted. i have found peace and remember it is an honor to say goodbye to someone and to always make it count as you never know when you will see them again. As a matter of fact i do not tell people Goodbye i say i love you or be safe.”

We arrived at the Marine Hotel. We got out of the car and he said, “Mr Ted it has been an honor speaking with you. i hope you enjoy your life. Be Safe Mr Ted.”

That left an indelible imprint on my mind.

Though I know I’ll never lose affection
For people and things that went before
I know I’ll often stop and think about them
In my life I love you more

On 9.11 – Today many lost loved ones. Grief, as Mr Lewis states, is very much like fear except you cannot Un-Grieve. You can be unafraid. Grief, as it turns out, is never-ending. There is no invertible transformation that makes you not grieve.

We have been so programmed to buck it up – suck it up, buttercup – that everything tries to gloss over the loss. Whether a human or a family pet, it is ok to grieve. There are people and animals in my life that i will never recover from losing, and for the longest time i beat myself up for not bucking up, buttercup.

Further contemplating this, i believe Grief is fractal. Zoom in on a fractal and it evolves and changes yet holds the same shape ad infinitum1.

Mandelbrot Set Generated Fractal

Grief, as it turns out, appears at least to me to be closely aligned: the more you peel it back, the more complex it gets.

Same Fractal Zoomed

Does time heal Grief? Not really. It is the memory that fades. Ergo other memories fade as a function of our leaky memory system.

We deal with healing in different ways. The Uber driver found solace in a religious text, others workout, some self-medicate, others try to replace the human or animal.

We want it to go away.

i say we should acknowledge the pain of grief and realize it and let it happen then further acknowledge that the next person or animal who is essential to you, use the opportunity and find strength in telling them “Be Safe, See ya Real Soon, or i love you more.” However above all, if you cherish that friend or loved one, it is an honor to tell them upon them walking out the door. Let them know it.

Until Then,

#iwishyouwater. <- Laird Hamilton on a Paddle board

@tctjr

Muzak To Blog By: A band called Papir.

[1] The Mandelbrot set is the set of complex numbers c for which the function

    \[f_{c}(z)=z^{2}+c\]

does not diverge to infinity when iterated from

    \[z=0\]
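That definition translates directly into code. Here is a minimal sketch of the membership test (the iteration cap and escape radius of 2 are the conventional choices, not part of the definition above):

```python
def in_mandelbrot(c, max_iter=100, escape_radius=2.0):
    """Iterate f_c(z) = z**2 + c from z = 0; treat c as in the set
    if |z| never exceeds the escape radius within max_iter steps."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > escape_radius:
            return False
    return True

print(in_mandelbrot(0))   # True  -- z stays at 0 forever
print(in_mandelbrot(-1))  # True  -- oscillates between -1 and 0
print(in_mandelbrot(1))   # False -- 0, 1, 2, 5, 26, ... diverges
```

Coloring each point c by how quickly it escapes is exactly what produces the fractal images above.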

Snake_Byte[9] XKCD PLOTS

An algorithm must be seen to be believed.

~ D. Knuth

First i trust everyone is safe. Second it’s WEDNESDAY so we got us a Snake_Byte! Today i wanted to keep this simple and fun and return to a set of fun methods that are included in the de facto standard for plotting in python, which is Matplotlib. The method(s) are called XKCD-style plotting via plt.xkcd().

If you don’t know what that is referencing, xkcd, sometimes styled XKCD, is a webcomic created in 2005 by American author Randall Munroe. The comic’s tagline describes it as “a webcomic of romance, sarcasm, math, and language”. Munroe states on the comic’s website that the name of the comic is not an initialism but “just a word with no phonetic pronunciation”. i personally have read it since its inception in 2005. The creativity is astounding.

Which brings us to the current Snake_Byte. If you want to have some fun and creativity in your marchitecture[1] and spend fewer hours on those power points, bust out some plt.xkcd() style plots!

First thing is you need to install matplotlib:

pip install matplotlib

in this simple example we need numpy:

pip install numpy

Then the snippet itself, with the matplotlib import the original omitted:

import matplotlib.pyplot as plt
import numpy as np

plt.xkcd()
plt.plot(np.sin(np.linspace(0, 10)))
plt.plot(np.sin(np.linspace(10, 20)))
plt.title('Sorta Lissajous')
plt.show()
Sorta Lissajous

So really that is all there is to it, with all the bells and whistles that matplotlib has to offer.

The following script was based on Randall Munroe’s Stove Ownership.

(Some will get the inside industry joke.)

with plt.xkcd():
    # Based on "Stove Ownership" from XKCD by Randall Munroe
    # https://xkcd.com/418/

    fig = plt.figure()
    ax = fig.add_axes((0.1, 0.2, 0.8, 0.7))
    ax.spines.right.set_color('none')
    ax.spines.top.set_color('none')
    ax.set_xticks([])
    ax.set_yticks([])
    ax.set_ylim([-30, 10])

    data = np.ones(100)
    data[70:] -= np.arange(30)

    ax.annotate(
        'THE DAY I TRIED TO CREATE \nAN INTEROPERABLE SOLUTION\nIN HEALTH IT',
        xy=(70, 1), arrowprops=dict(arrowstyle='->'), xytext=(15, -10))

    ax.plot(data)

    ax.set_xlabel('time')
    ax.set_ylabel('MY OVERALL MENTAL SANITY')
    fig.text(
        0.5, 0.05,
        '"Stove Ownership" from xkcd by Randall Munroe',
        ha='center')
Interoperability In Health IT

So dear readers, there it is, an oldie but goodie, and it is so flexible! Add it to your slideware or marchitecture or just add it because it’s cool.

Until Then,

#iwishyouwater <- Mentawis surfing paradise. At least someone is living.

Muzak To Blog By: NULL

[1] Marchitecture is a portmanteau of the words marketing and architecture. The term is applied to any form of electronic architecture perceived to have been produced purely for marketing reasons and has in many companies replaced actual software creation.