In my current role as Emerging Technologies Coordinator for University of Oklahoma Libraries, I explore, develop, and deploy new technologies for use in research and instruction. Below are a few examples. Please don't hesitate to reach out - via the Personal page - for more information or to collaborate.

The Sparq labyrinth is an interactive meditation tool. Using a touch-screen interface, the Sparq user selects from a variety of culturally significant labyrinth patterns and then engages (i.e. walks, performs yoga, or even dances) the projected pattern to attain a refreshing connection to the moment. This five-minute mindfulness technique requires no training, and it has been linked to decreases in systolic blood pressure and increases in quality of life - making the Sparq a fitting wellness solution for a stressful workplace.

How can we be sure? Because the Sparq has been deployed across the nation in a diversity of settings. Indeed, everyone from academic researchers (and stressed-out students) at UMass Amherst, the University of Oklahoma, Concordia University, and Oklahoma State University, to Art Outside festival-goers, to Nebraskan wine tasters has experienced the benefits of this interactive mindfulness tool.

The Sparq provides a uniquely personal meditation experience. With touch-screen access to a variety of patterns - each representing a distinct cultural heritage - Sparq users connect with history while reconnecting with themselves.

Unlike traditional labyrinth installations, the Sparq is mobile and (after the components have shipped) can be set up in about an hour. This ease of installation, combined with the stunning beauty of the projected patterns, makes the Sparq a wellness solution suitable for nearly any workplace.

Ready for a Sparq? Contact me, and I'll share much more information about the thinking and motivation behind the Sparq, links to its documented benefits, and instructions for setting up the system at your institution. Then you can experience for yourself the myriad benefits of the Sparq meditation labyrinth.

In Pima and Papago (Native American) cultures, the design below represents "Siuu-hu Ki" - "Elder Brother's House". Legend has it that, after exploiting the village, the mythical Elder Brother would flee, following an especially devious path back to his mountain lair so as to make pursuit impossible. Elder Brother's House is one of several culturally significant labyrinth patterns that lend a powerful gravity to the overall Sparq experience.

"Hypnose" - Rapid Prototyping project

Bruton, Eric. Clocks & Watches. New York: Hamlyn Publishing Group, 1968.

OU Libraries' new makerspace/fab lab/incubator, Innovation @ the EDGE, is centered on the idea that demystifying emerging technology is critical to non-STEM engagement. Since my academic background is in the humanities (philosophy), a demonstration of rapid prototyping that takes inspiration from our large collection seemed important. Hence the Hypnose smell-clock: a mostly 3D-printed prototype incorporating microcontroller components and programming, inspired by the sorts of historical examples described in history-of-timekeeping texts found in the book stacks (as cited above).

Bronze Head of Hypnos from Civitella d'Arna

The original motivation for the Hypnose was simple: there are problems associated with waking up and checking one's smartphone to figure out whether it is indeed time to wake up! Of course, alarms are a solution, although they aren't necessarily a pleasant way to start the day. Moreover, there are temptations (e.g. social media) associated with picking up your phone in the middle of the night. How to avoid the phone, then, and still get up in time for work? Why not train myself to wake up on time subconsciously, by associating different phases of my sleep cycle with distinct scents?

This implementation used an Arduino Uno along with a SparkFun motor shield to power a stepper motor from a wall outlet. The precise rotational control provided by a stepper motor (as opposed to a torque-heavy servo) allows the code below to "jump" a measuring spoon - containing a small amount of scented wax melt - to a position directly above a heat lamp. This jump is programmed to occur every hour (3,600,000 milliseconds in Arduino terms), an interval easily doubled to cover an 8-hour sleep cycle, given four spoons. A certain wax melt, then, would always correspond to the final two hours before waking. I will undoubtedly come to dread that smell!

int dirpin = 2;
int steppin = 3;

void setup()
{
  pinMode(dirpin, OUTPUT);
  pinMode(steppin, OUTPUT);
}

void loop()
{
  digitalWrite(dirpin, LOW);      // Set the direction.

  for (int i = 0; i < 400; i++)   // Iterate for 400 steps.
  {
    digitalWrite(steppin, LOW);   // This LOW to HIGH change creates the
    digitalWrite(steppin, HIGH);  // "rising edge" that tells the driver when to step.
    delayMicroseconds(1000);      // This delay is close to top speed for this
  }                               // particular motor; any faster and it stalls.

  delay(3600000UL);               // Wait one hour before the next jump.
}


The assembly, originally modeled in SketchUp (above), takes its cue from a 1st-century bronze sculpture discovered in central Italy. According to Wikipedia, Hypnos' cave had no doors or gates, lest a creaky hinge wake him. It seems we both faced similar problems. Also, this ancient realization of the Greek god of sleep conveniently lacked eyes, which are actually holes in the sculpture. My thinking was that the scent could vent from those holes with the aid of a small computer fan, although the final prototype uses Hypnos as more of an aesthetic choice.

The Hypnose "face" - an amalgamation of a free, low-poly mask model found online and a set of wings, scaled and rotated - ultimately took close to 8 hours (and 3 tries) on our MakerBot printer, but the finished prototype works more or less perfectly. More importantly, OU Libraries now offers free training on all the tech associated with this project, so once-intimidated humanities majors (like me) can leverage the creativity they are known for - inspired perhaps by source material in our collection - to design and deploy their own creations.

We are in a second proof-of-concept stage for a mobile app that guides users through large indoor spaces while providing a plethora of location-based info and relevant push notifications (e.g. events, technology tutorials, etc.) along the way. The ongoing OU Libraries-based pilot program has paved the way for a campus-wide rollout of this cutting-edge technology. This tier-two launch coincides with the Galileo's World exhibition, which debuted in August of 2015. The tool now provides:

  • Integration of the online and offline University of Oklahoma user experience via real-time, turn-by-turn navigation.
  • Delivery of hyper-local content corresponding to the user's location with respect to campus resources, both indoors and out.
  • Powerful analytics capabilities, which allow for the analysis of space/service/technology usage throughout navigable areas.
  • Various associated utilities to assist disabled users as well as aid in emergency situations.

People tend to refer to the central routing feature as “indoor GPS”. It’s accurate to within a meter, and it fulfills a goal we began focusing on early last year: simplifying an extraordinarily complex physical environment.

Bizzell, after all, is huge – and filled with services (some of which I’m barely familiar with myself). What we didn’t want, then – and something I've seen personally – is a senior-level undergraduate proudly proclaiming that they are using our facilities for the first time.

Basically, our aim from the beginning was twofold: to put an end to the intimidation new students might feel when visiting the library for the first time, and to make our diverse services visible to visitors via an increasingly prevalent piece of pocket-sized hardware: the smartphone.

At the end of the 2015/16 academic year – the first semester in which the NavApp was available for free public download – more than 2,000 unique users had downloaded and engaged with this innovative wayfinding tool. Indeed, our engagement numbers were particularly encouraging, with back-end analytics indicating that individual users accessed more than 16 in-app screens on average.

Finally, the press has responded positively to the NavApp, and we've even received national awards for our work on this project. Please reach out to find out how to deploy your own wayfinding tool.

After months of R&D, OVAL 1.0 is ready for use. With this hardware/software platform, instructors and researchers alike can quickly populate a custom learning space with fully interactive 3D objects from any field, then share the analysis of those models across a network of virtual reality headsets - regardless of physical location or technical expertise. In this way, you are free to take your students or co-researchers into the "field" without leaving campus!

CHEM 4923, group RNA fly-through. 

Not only are previously imperceptible, fragile, or distant objects (like chemical molecules, museum artifacts, historical sites, etc.) readily accessible in this shared learning environment, but - using our public-facing file uploader - even the most novice users can easily drag and drop their 3D files into virtual space for collaborative research and instruction in virtual reality. Simply upload and sit down to begin.

Custom fabricated, library-designed VR workstation - courtesy of OU Physics dept.

Finally, natural interaction types - like leaning in to get a closer look at a detailed model - are preserved and augmented by body-tracking technology. When coupled with intuitive hand-tracked controls (one less piece of software to learn!) and screenshot and video capture functions for output to downstream applications (e.g. publications and presentations), new perspectives can be achieved and captured to aid your scholarship.

"The impact on the students this week was immeasurable", says one OU faculty member who has already incorporated the OVAL into her coursework. How can we help you achieve the same impact? Please reach out for a personal consultation and let OU Libraries show you how this powerful tool, which is currently available for walk-in use in Innovation @ the EDGE, can support your educational goals.

3D Scanning - Experiments & Implications

My current professional focus on 3D visualization has led to experimentation with a host of scanning solutions. Basically, the goal is more accurate digitization - an interactive snapshot with searchable, browsable depth.

The 3D assets below were generated using the Sony DSC-RX100 (for capturing high-definition, multi-angle stills of the specimens) and Autodesk Memento (for stitching those stills together into a surface mesh).

Please reach out, via the Personal page, if you have a collection, antique, artifact, or specimen that you would like to see preserved in this robust digital format.

The above prickly pear scan isn't perfect, but it's the only usable botanical scan that I've managed to generate after a half-dozen tries. Narrow-width connecting components (e.g. stems) in particular seem to disappear during Autodesk's cloud-based stitching process, which would explain why this opuntia came out while numerous capsicum scans did not. Lesson learned.

This statue of Omar Khayyam is located in the heart of OU's Norman campus. Fortunately, it was an overcast day when the scan was done; otherwise the direct sunlight would have reflected off the white stone. The statue is quite tall (about 8 ft.), however, so the imperfect top of this Persian polymath's cap was sliced off in post-production. Diffuse light and multi-angle access are necessary for a good scan.

As described on the spatial page, this sheepherder's cabin represents a "field scan", whereby off-grid artifacts can be manipulated, analyzed, or otherwise investigated after the fact for details that onsite limitations (like time) simply won't allow. Measurements, for example, can be made and recorded later, after the threat of rattlesnakes has long since passed.

VR-based analysis of early 20th century sheepherder's ruins. Note the measurement tool. 

Combining a few best practices gleaned from generating high-quality field scans like the sheepherder's cabin with the ability to effectively scan certain living, albeit static, organisms (plants, that is) means that 3D asset repositories of invasive flora, endangered orchids, or even entire crops are feasible and perhaps inevitable.

Downstream analysis of these 3D assets can take place not only centrally - at the local institute of higher ed, for example - but at the expert's leisure. Moreover, screen-capture software means that new perspectives on distant, fragile, or rare data sets can be output for presentation and publication, regardless of whether that perfect viewing angle was attained at the time of the scan.