StoryBook AR was a Unity-based project created under NJIT's I-Corps Grant program. The idea was to use augmented reality to add interactivity to illustrated books. Toward the end of the project, I worked to turn it into a tool that others could use to easily build simple AR applications of their own, using the Vuforia package for AR support. I worked alongside my professor, Jessica Ross, as she created an illustrated storybook, Gulnare of Persia. StoryBook AR was an ambitious project, but the end result was only a working prototype of the initial concept, and it has since been left as a proof of concept.
StoryBook AR was built heavily upon the Vuforia SDK, which at the time was the most accessible and capable AR package available to me. Learning the ropes and getting everything set up properly took a little longer than intended, but once that was squared away, things went fairly well.
Due to time constraints on both my and Jessica's ends, only three pages of her book were made AR interactive. Following Vuforia's guidelines for what the package recognizes best, solid black icons were placed on the pages to serve as image targets, as seen in the page illustrations above. The app I produced displayed step-by-step progressions or related illustrations for each page, accompanied by a voice-over explaining the content. To stick with the visual theme of the book, I had the images appear inside a bubble: when the app scanned the image target, the bubble floated up from the bottom of the screen, held in place while it showed the images, and then floated upward and off screen. Extended tracking was enabled to improve tracking robustness, and I added a simple particle system of bubbles in the background to keep the environment feeling lively.
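The bubble sequence can be sketched as a Unity coroutine along these lines. This is a minimal sketch, not the original code; the class name, field names, and off-screen distance are all hypothetical.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: the bubble rises into view, holds while the
// page's illustrations and voice-over play, then floats off screen.
public class BubbleSequence : MonoBehaviour
{
    public float riseSpeed = 2f;    // upward speed in units per second
    public float holdHeight = 0f;   // y position where the bubble pauses
    public float displayTime = 5f;  // seconds to show the page's images

    // Called when Vuforia reports the page's image target was found.
    public void OnTargetFound()
    {
        StartCoroutine(PlaySequence());
    }

    private IEnumerator PlaySequence()
    {
        // Float upward from below the screen to the hold position.
        while (transform.position.y < holdHeight)
        {
            transform.Translate(Vector3.up * riseSpeed * Time.deltaTime);
            yield return null;
        }

        // Hold while the step-by-step images and voice-over play.
        yield return new WaitForSeconds(displayTime);

        // Continue floating upward until the bubble leaves the screen.
        while (transform.position.y < holdHeight + 10f)
        {
            transform.Translate(Vector3.up * riseSpeed * Time.deltaTime);
            yield return null;
        }
        gameObject.SetActive(false);
    }
}
```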
The original intent was to further refine the app and build it for the Nintendo 3DS platform. However, due to unforeseen conflicts with the grant funding, this idea was scrapped. I instead tried to repurpose the project as a tool for beginners to quickly and easily make simple image-targeting AR applications. To get users started, I created a handful of freebie assets: a set of shaders, each paired with a script that animated it, along with scripts for animated text UI effects. All of the scripts were then simplified so that their public variables were few and easy to understand (animation speed, color, etc.), leaving little room for hiccups.
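A paired animation script of this kind might look something like the following sketch, where the class name, shader property names, and defaults are all illustrative; the point is that only a couple of simple public variables are exposed to the user.

```csharp
using UnityEngine;

// Hypothetical sketch of a "paired" animation script: it drives a shader
// property over time while exposing only beginner-friendly public fields.
[RequireComponent(typeof(Renderer))]
public class AnimatedMaterial : MonoBehaviour
{
    public float animationSpeed = 1f; // how fast the effect cycles
    public Color color = Color.white; // tint applied to the material

    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        // Feed a progress value into the shader each frame; the shader
        // itself decides how to use it (e.g. UV offset, dissolve amount).
        // "_Progress" and "_Color" are illustrative property names.
        rend.material.SetFloat("_Progress", Time.time * animationSpeed);
        rend.material.SetColor("_Color", color);
    }
}
```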
I then wrote C# editor scripts to create a custom window that sped up the workflow. For visual effects on images or other game objects, all the user had to do was drag the assets from the hierarchy into the appropriate fields and press the button. If the applied material was one I created, the tool would automatically add the associated animation script to the image or game object. I also added a "Text UI" section where the user only had to place the text UI game object into the field and could then toggle which text effects were applied to it.
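The editor window described above could be sketched roughly like this. The class name, menu path, and shader-name check are all hypothetical, and `AnimatedMaterial` stands in for whichever paired animation script matches the chosen material.

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical sketch of the custom editor window: drop a game object and
// a material into the fields, press the button, and the tool applies the
// material and attaches the matching animation script.
public class StoryBookWindow : EditorWindow
{
    private GameObject target;
    private Material material;

    [MenuItem("Window/StoryBook AR Tools")]
    public static void ShowWindow()
    {
        GetWindow<StoryBookWindow>("StoryBook AR");
    }

    void OnGUI()
    {
        target = (GameObject)EditorGUILayout.ObjectField(
            "Target Object", target, typeof(GameObject), true);
        material = (Material)EditorGUILayout.ObjectField(
            "Material", material, typeof(Material), false);

        if (GUILayout.Button("Apply Effect") && target != null && material != null)
        {
            target.GetComponent<Renderer>().sharedMaterial = material;

            // If the material uses one of the tool's shaders, attach the
            // paired animation script automatically. The "StoryBook/"
            // shader prefix and AnimatedMaterial type are illustrative.
            if (material.shader.name.StartsWith("StoryBook/")
                && target.GetComponent<AnimatedMaterial>() == null)
            {
                target.AddComponent<AnimatedMaterial>();
            }
        }
    }
}
```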