Recently, at work, we got a 3D printer. I work at a health care center serving mostly the poor: people on Medicaid or without insurance. People have asked, what does 3D printing have to do with that? Are you going to print syringes?
I've been thinking a lot about it. 3D printers, at least in my workspace, are about fostering creativity. How do we get people to think more creatively, not only about what they put down on paper or canvas, but about how they live their lives and promote health around them? Does learning how to design and print 3D objects help empower people to be more creative? Does it even, simply, get people who should be getting primary health care in the door?
How do we use having a 3D printer in our innovation center to encourage people to come forward with creative ideas? Does fostering creativity in one realm, like 3D design, encourage creative thinking in another realm, like public health? These are issues for me to explore.
So far, I've been testing the 3D printer, getting to know what it does and doesn't do, and learning how to operate it most effectively. I've printed a couple of cones and an Ingress Enlightened game insignia. I've started looking at 3D design and exploring different design packages. There are several free ones, like Sketchup and Blender. I've used both in the past, and I'm starting to relearn them to see if I can make some neat objects.
I started thinking about 3D design back when I was active in Second Life, a 3D virtual world. I’ve encouraged people to use 3D virtual games to create animated videos. These days, my youngest daughter plays a lot in Minecraft and related games. I like Minecraft much better than a lot of the other games she plays because it is a game that encourages rudimentary 3D design. Can I use it as a gateway to Sketchup, Blender, and Opensim for her?
I’ll continue to work on my 3D design skills. I’ll try to find others interested in these skills near where I work. It may not lead to a cure for cancer, but if it can provide even a small spark that improves the health of our communities, it will be worth it.
Do you do any 3D design or printing? Are there systems, tutorials, or projects you recommend? Let me know.
I’ve been looking at app development recently and speaking with different people about the tools they use. One of them mentioned MongoDB, a document-oriented NoSQL database. I loaded it on one of my servers and played with it a little. I was impressed with how simple it was to get started.
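Part of that simplicity is that MongoDB stores self-describing, nested documents rather than rows spread across related tables. Here is a minimal sketch of the idea in Python; the collection and field names are made up for illustration, and the JSON round-trip stands in for what a real insert and fetch would do.

```python
import json

# A single "document" as a document database would store it: nested and
# schema-free, instead of rows split across several relational tables.
# (All names here are hypothetical, for illustration only.)
record = {
    "patient": {"name": "Jane Doe", "insurance": "Medicaid"},
    "visits": [
        {"date": "2013-09-12", "reason": "checkup"},
        {"date": "2013-10-01", "reason": "follow-up"},
    ],
}

# With pymongo this would be roughly: db.patients.insert_one(record)
# Here we just round-trip it through JSON to show it is self-describing.
serialized = json.dumps(record)
restored = json.loads(serialized)
print(restored["visits"][0]["reason"])
```

The appeal is that the whole record travels together: adding a new field to one document requires no schema migration, which is much of why getting started feels so easy.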
Yet as we move away from tabular storage of data, it raises the question: how should we think about organizing information?
There is the great line from The Cluetrain Manifesto: "Hyperlinks subvert hierarchy." There is a lot in that statement. To what extent are hyperlinks subverting authoritarian structures based on hierarchy? How is this playing out in media, education, and politics? And how is it playing out in organizing information?
As I dig a little deeper into NoSQL databases, I’m finding myself more interested in triple store, or graph-oriented, databases. Instead of having a limited, predefined set of relationships, like parent-child in a traditional relational database, what can we do when we start storing many different types of relationships in databases? What can we do when we start graphing out this information?
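The core idea of a triple store can be sketched in a few lines of Python: every fact is a (subject, predicate, object) triple, and any part of a query pattern can be left open, much like a variable in a SPARQL query. The names and relationships below are invented for illustration.

```python
# Minimal in-memory triple store: each fact is (subject, predicate, object).
# (All names and predicates here are hypothetical examples.)
triples = {
    ("alice", "parent_of", "bob"),
    ("bob", "parent_of", "carol"),
    ("alice", "friend_of", "dave"),
    ("dave", "works_with", "bob"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard,
    roughly like a variable in a SPARQL query."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# New relationship types need no schema change -- just add more triples.
print(match(p="parent_of"))   # all the parent links
print(match(s="alice"))       # everything we know about alice
```

Real triple stores add indexing and a full query language on top, but this shows why they feel so open-ended: relationships are data, not schema.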
So, on my radar for future explorations are Neo4j and SPARQL. From there, I may wander back into topics like RDF, the Semantic Web, and, of course, once information becomes more machine readable, back to the singularity.
Are you playing with MongoDB? Neo4j? SPARQL? RDF? The Semantic Web? What things do you think I should be looking into? Are there good starting points and tutorials?
This evening, I sat down to my evening positive attitude adjustment and found Howard Rheingold had shared on Facebook a link to Jason Feifer’s comments in Fast Company, GOOGLE MAKES YOU SMARTER, FACEBOOK MAKES YOU HAPPIER, SELFIES MAKE YOU A BETTER PERSON.
It was, in my opinion, a very well written response to Sherry Turkle’s recent Op-Ed in the New York Times, The Documented Life, where she complains about selfies.
My initial reaction to Turkle’s piece was to write Sisyphus’ Selfie. I’ve been intending to write more on this, and I started to write a comment on Howard’s status. Yet as it grew, I thought I should really make it part of my blog post.
I started off:
I must say, as an active participant in LambdaMOO back in the mid 90s and a friend of many of the researchers and cyberanthropologists who became involved there, I've always found Turkle to be a bit full of herself (and other stuff).
I read her Op-Ed and found that my opinion of her hasn't improved over the past 18 years. I've been meaning to write a blog post about her article, very similar to Feifer's, but perhaps from a slightly different angle.
This is where I decided to merge the comment into this blog post. One person suggested just calling Turkle a Luddite, and then went on to repeat various assertions of Turkle's that are tangential to the article, claiming them as facts.
I think Luddite is an overused word amongst technophiles and so I want to present a slightly different idea.
Marc Prensky, in his famous article Digital Natives, Digital Immigrants, presents the idea of people who have grown up in a digital culture as digital natives. Those who have moved into a digital culture, having grown up in a different one, are digital immigrants.
In my mind, this fits nicely with some of what Turkle talks about. Yes, growing up in a digital culture does change the way we think and act. Yet this also points to the biggest problem with what Turkle has to say.
She is looking at digital culture from the viewpoint of a digital immigrant. For example, her comment:
"We don’t experience interruptions as disruptions anymore. But they make it hard to settle into serious conversations with ourselves and with other people because emotionally, we keep ourselves available to be taken away from everything."
This sure sounds a lot to me like that old grandmother living in the immigrant community complaining about how people these days just don’t do things the way they used to in the old world, and how much better the old world was.
I pause to think a little more and glance at my daughter creating something in Minecraft. She is a digital native. Me? Having been on the Internet for over thirty years, and on bulletin boards and programming computers long before that, I tend to think of myself as a digital pioneer, or perhaps a digital aborigine.
Yes, working with computers for all these years has changed my way of thinking. A critic might compare it to the way mercury changed the thinking of hatmakers, and my children might have other comments about having a Dad that has been online longer than they have.
Yet I relish my experiences with technology and I’m glad that my children are having even greater experiences with it. I love the camaraderie of other digital pioneers or digital aborigines.
Through my discussions with friends on Facebook, I’ve also found myself talking about Jacques Ellul, whether or not people need to learn to program, representations of transhumanism, The Power of Patience and Civil Religion and how it relates to prophetic religion, the social contract, the way we interact through digital media, and if there are implications for a Great Awakening.
And, for that matter, I let a young college student from Iran borrow my Google Glass this afternoon, so he could take a selfie of him wearing Google Glass, standing next to a robot.
Technology does change the way we think and act. There is much that needs to be discussed about it. I’m happy that Facebook has given me topics to Google and become smarter about. I’m just not sure that Turkle is really adding much of value to the conversation.
Rabbit, Rabbit, Rabbit. Well, here we are, another October. Like other months, when I get time, I start off with a childhood invocation for good luck.
But it's October. Thirty-seven years ago this month, a classmate of mine from high school disappeared. They found her body later in the month, but never found the murderer. Last year, towards the end of October during Hurricane Sandy, my mother died in a car accident.
Looking back over my career, many of my job changes took place in October. My youngest daughter was born in October, as were some of my closest long time friends.
It's October, and the Government is shut down. This weekend, I sat on the porch, after making a batch of green apple jelly. Yes, I'm connected online. With my Google Glass, I get notifications as they happen. But there is something about sitting on the porch, having just made jelly.
I thought about when my mother was a kid. Yes, she heard fairly quickly, via the radio, about the bombing of Pearl Harbor, but most news was much slower then, and even slower before the radio and telegraph. How much is this always-on, instant notification contributing to dysfunction in Washington, where people seem more interested in the political theatre of the sound bite than in sound governing?
How much is the medium the message?
I've been reading The Blithedale Romance by Nathaniel Hawthorne. The setting is a utopian community in the mid nineteenth century. The hero is sick and reads books that other members of the community bring to him. Yet I'm reading it as an ebook on my smartphone. What is the mixed message of a nineteenth century novel on a twenty-first century device?
Kim and I have started watching "H+", a series about human implants, similar to Google Glass, and a mass kill-off of people with the implants caused by a network virus. The medium is the message, as my wife and I watch it on an old TV hooked up to an old Roku which still manages to get YouTube. I watched an episode on Google Glass, which pushes the medium-is-the-message idea even further.
And here I am, writing a blog post about it.
It is a post-apocalyptic world, and I've been thinking about this new millennialism, a resurgence of apocalyptic thinking. No, we didn't have a Mayan apocalypse. We haven't had an apocalypse as a result of people of the same gender who love each other now being able to marry one another.
Now, even though the Federal Government is shutdown, you can go online and purchase health insurance. Like same-sex marriage, for some this looks like the end of the world. For others, the Federal Government shutdown looks like the end of the world.
But as I sat on the porch over the weekend, with a kitchen full of jams and jellies that I've made, and as I sit in my chair now, writing my blog post and listening to the large dog snore on the couch next to me, this is nothing like the end of the world in all the dystopian post-apocalyptic stories.
So I say Rabbit, Rabbit, Rabbit, bringing back all the simple childhood hopes and memories in this complicated hyper-connected world as I think of dogs and jelly and porches, and trying to get back to sleep.
Matthew Katz recently posted a link to his article in KevinMD, Google Glass for medicine: 4 reasons why it could be disastrous, saying:
Am I just turning into a technophobe? My post on KevinMD about Google Glass.
As a person who has been using Google Glass for the past three months in a health care setting, I believe you have become a technophobe.
Privacy Violations: The same issue applies to cellphones. Are you going to ban them from your practice?
Hackable: Personal computers are hackable as well. Ban them? (I worked with security for a Swiss bank two decades ago, when they said they'd never connect to the Internet because of security issues. There are risks with all technology, just like everything else in life. You can't ban life; instead, you need to mitigate risks.)
Concern with multitasking: This is probably the strongest point, which also seems pretty weak based on my experience with Google Glass. The interruptions I get from Google Glass, wearing it all the time, are similar to the interruptions I get from phone calls, overhead pages, and other staff members knocking on my door.
Google’s and medicine’s goals aren’t aligned: Again, on the surface, this seems like a valid point. However, from my experience dealing with pharmaceutical companies, medical device manufacturers, and insurance companies, I suspect that Google's goals may be more closely aligned with medicine's than those of most companies working in health care.
Over on his article, I added a couple of additional thoughts, edited for the blog here:
The other point that I would make is that Google Glass is not in BETA. It is not even in ALPHA. It is still a prototype. I think it is premature to make determinations about what a prototype is likely to do to a business. You might want to go back and look at the history of the Xerox machine.
The Smithsonian Article, Making Copies is a good starting point.
At first, nobody bought Chester Carlson's strange idea. But trillions of documents later, his invention is the biggest thing in printing since Gutenberg
Companies turned down the Xerox machine because so few people made copies before it existed that they didn't think it would sell.
My experience with Glass, so far, is similar to my experiences with the Apple Newton in the early 90s. A lot of people didn't think much of the Newton back then, and it never really took off, but it laid the groundwork for smartphones today.
I wouldn't be surprised to see Glass follow a similar path and in twenty years be an all but forgotten precursor to ubiquitous wearable computing.
One last thought: it is worth looking at the Technology Adoption Life Cycle, as written about back in the 50's, particularly by Everett Rogers in his book Diffusion of Innovations.
Google Glass is at the very front end of the adoption lifecycle, where only a few innovators have been using it. As has become more and more common these days, when a new innovation comes along, it often gets a backlash. It seems that the backlash against an innovation is proportional to the potential disruption the innovation carries.
As a final comment, I'd encourage you to read a blog post I wrote back in 2007 about Twitter:
In a previous post about ad:tech, I mentioned how I learned about NY Times' Facebook page from a twitter by Steve Rubel. I commented about this in the press room, and one of the reporters was surprised to hear that twitter was still around and active. I reflected back on hearing speakers at OMMA predict the demise of Twitter, Facebook and Second Life and it struck me that the standard technology adoption curve that we all hear so much about, may have a lot of interesting nuances.