Skied up Mount St. Helens
Clint, Craig and I skied up Mt. St. Helens yesterday. I ate steak and potatoes the night before but it wasn’t enough. At the halfway point we could see figures on the summit, but my endurance was very low. I could skin up about 100 feet, then I’d have to pause for a while to catch my breath, and repeat. The distances between pauses got shorter the higher I got.
I apologized to Clint for going so slow, and he said he wasn’t in any hurry and that I should take my time. My son and Craig are both superb skiers, while I am mediocre at best. I was the oldest guy on the mountain by at least 20 years; most of the roughly 100 people we saw were between 25 and 35, and most of them were on snowshoes or on foot. Oddly, only about 1 in 5 were on skis. Skis made the trip down a lot faster, and in spots the skiing was excellent, feeling like the steep section of a blue run at Crystal, i.e. the Forest Queen chair lift. The skiing down the Worm Flows route was fairly easy in spring corn, though it had some cement snow in places.
Down lower on the main hiking route there were tons of posthole tracks, which made for choppy skiing. I was so tired by then (10 mile round trip, 4500 feet of gain) that all I could do was shaky snowplow turns. We were racing the light, as getting caught in the dark on icy snow was not an attractive option.
I’ve been studying Python. My Django studies got sidetracked because Django is written in Python and I’d never done Python. Sometimes I feel like I’m trying to learn Japanese, French, Russian and Spanish all at the same time.
Here’s my latest exercise: Python code that pulls JSON data from the USGS Earthquake website, formats it, and spits it back out as a list of recent earthquakes with each one’s strength, location, and whether anyone reported feeling it.
```python
# Example file for parsing and processing JSON
import urllib.request
import json

def printResults(data):
    # Use the json module to load the string data into a dictionary
    theJSON = json.loads(data)

    # Now we can access the contents of the JSON like any other Python object
    if "title" in theJSON["metadata"]:
        print(theJSON["metadata"]["title"])

    # Output the number of events
    count = theJSON["metadata"]["count"]
    print(str(count) + " events recorded")

    # For each event, print the place where it occurred
    for i in theJSON["features"]:
        print(i["properties"]["place"])
    print("-----------------\n")  # this runs after the for loop is done

    # Print only the events with a magnitude of 4.0 or greater
    for i in theJSON["features"]:
        if i["properties"]["mag"] >= 4.0:
            print("%2.1f" % i["properties"]["mag"], i["properties"]["place"])
    print("-----------------\n")  # this runs after the for loop is done

    # Print only the events where at least 1 person reported feeling something
    print("Events that were felt: ")
    for i in theJSON["features"]:
        feltReports = i["properties"]["felt"]
        if feltReports is not None and feltReports > 0:
            print("%2.1f" % i["properties"]["mag"], i["properties"]["place"],
                  " reported " + str(feltReports) + " times")

def main():
    # Define a variable to hold the source URL.
    # In this case we'll use the free data feed from the USGS.
    # This feed lists all earthquakes for the last day larger than Mag 2.5.
    urlData = "http://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"

    # Open the URL and read the data
    webUrl = urllib.request.urlopen(urlData)
    print("result code: " + str(webUrl.getcode()))  # 200 means the request succeeded
    if webUrl.getcode() == 200:
        data = webUrl.read().decode("utf-8")
        printResults(data)
    else:
        print("Received error, cannot parse results")

if __name__ == "__main__":
    main()
```
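To see the shape of the data the script walks through without hitting the network, here is a minimal sketch that feeds a tiny hand-made GeoJSON string through `json.loads`. The sample data is invented for illustration, not real USGS output, but it mimics the fields the script reads: `metadata.count`, plus each feature’s `mag`, `place`, and `felt`.

```python
import json

# Hand-made GeoJSON snippet (illustrative only, not real USGS data)
sample = """
{
  "metadata": {"title": "Sample Earthquake Feed", "count": 2},
  "features": [
    {"properties": {"mag": 4.6, "place": "100km W of Somewhere", "felt": 3}},
    {"properties": {"mag": 2.7, "place": "10km N of Elsewhere", "felt": null}}
  ]
}
"""

quakes = json.loads(sample)
print(quakes["metadata"]["count"], "events")  # 2 events
for f in quakes["features"]:
    props = f["properties"]
    # JSON null becomes Python None, so this skips events nobody felt
    if props["felt"]:
        print("%2.1f" % props["mag"], props["place"],
              "reported", props["felt"], "times")
```

One thing worth noticing: JSON `null` comes back as Python `None`, which is why the full script above checks `felt` before comparing it to a number.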