Making Google Analytics and jQuery Mobile Work Together

The jQuery Mobile framework is excellent for many things, but its single-page architecture can be hard to track through analytics packages. This post will show you how to tweak your Google Analytics insert code to help it gather statistics from jQuery Mobile.

A little more background on why this is necessary… Within jQuery Mobile, each page view is just a specific snippet of HTML on a single HTML page, pulled into the browser view by JavaScript. Most analytics packages treat an entire visit as a single page load, and your page view statistics end up looking like, well, not much, as the only page recorded is the source page – index.html. Not very helpful in terms of analysis… Google Analytics requires a few tweaks to be able to record these JavaScript-generated page views. Here are the steps to make that happen.

Step 1: Include the asynchronous Google Analytics script loader between the <head> tags in your jQuery Mobile index page.

<script type="text/javascript">
 // Queue the account ID and settings; the queue is replayed once ga.js loads.
 var _gaq = _gaq || [];
 _gaq.push(['_setAccount', 'UA-XXXXXXXX-XX']);
 _gaq.push(['_gat._anonymizeIp']);
 // Load ga.js asynchronously so it doesn't block the rest of the page.
 (function() {
  var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
 })();
</script>

All we have done here is bring the Google Analytics JavaScript tracking code onto the page. You will need to specify the web property ID you received when setting up your Google Analytics web property to identify the web site/app you want to track. (Add that info to the “UA-XXXXXXXX-XX” section above.) We included the asynchronous loader method as it lets other parts of the page load as connections become available, which helps with performance over spotty mobile networks (or dodgy networks anywhere…).

Step 2: Using the jQuery base library and its native event functions, add a “track page views” routine right before the closing </body> tag in your jQuery Mobile index page.

<script type="text/javascript">
 // jQuery Mobile fires "pageshow" each time it displays a page, including the first load.
 $('[data-role="page"]').on('pageshow', function (event, ui) {
  try {
   // Record the id of the page <div> as the page view.
   _gaq.push(['_trackPageview', event.target.id]);
   console.log(event.target.id); // debugging only; remove in production
  } catch(err) {
   // Fail silently if ga.js hasn't loaded yet.
  }
 });
</script>

In the routine above, we bind to the “pageshow” event, which jQuery Mobile fires every time it displays a page, and on each firing we push the Google Analytics _trackPageview() method onto the queue. With the line – “_gaq.push(['_trackPageview', event.target.id]);” – we are telling Google Analytics to record the id of the specific page snippet in the HTML into our analytics data. With these changes, the logs start to look more familiar, and each snippet of HTML that has been visited will be part of the log record.

[Figure 1: Screenshot showing Google Analytics recording each page view in a jQuery Mobile app]

A final note: Under this method of tracking, the way you identify your individual pages within jQuery Mobile is important. Use markup and id values that will make sense when you see them in your Google Analytics logs (or any logs for that matter…). Some sample markup:

<div data-role="page" id="search" data-theme="d">

Because the page id has been set up intuitively, I will be able to look for “search” in my Google Analytics reports and get specific tracking data about that particular “page”.
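To make that concrete, here’s a minimal multi-page skeleton (the id values are just my own examples) where each page div carries an id that will read well in the logs:

<body>
 <div data-role="page" id="home" data-theme="d"><!-- home page content --></div>
 <div data-role="page" id="search" data-theme="d"><!-- search page content --></div>
 <div data-role="page" id="hours" data-theme="d"><!-- hours page content --></div>
</body>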

To see a full HTML example with all of the above code in context, visit http://www.lib.montana.edu/~jason/files/touch-jquery/ and “view source”.


Mapping your location with Google Static Maps API (tutorial)

The Google Static Maps API (code.google.com/apis/maps/documentation/staticmaps/) is a dead simple way to embed a map showing a specific location in mobile and desktop environments. Its real advantage is that the embed comes in the form of a standard HTML image <img> tag. In this quick codelab tutorial, I’m going to show you how to create a static map and set a location using the Google Static Maps API. If you know HTML, the markup for the image should look familiar.

<img src="http://maps.google.com/maps/api/staticmap?center=Bozeman,MT&zoom=13&size=310x310&markers=color:blue|45.666671,-111.04859&mobile=true&sensor=false" />

As I mentioned earlier, it is a simple <img> tag with a call linking into the Google Static Maps API. In order to place the map at a specific location, we pass a few parameters to the Google Static Maps API in the query string. Included among the customizations:

  1. A central location for the map
  2. A zoom level for the map
  3. A size for our map image
  4. A marker color and position using latitude and longitude
  5. Flags setting the map for mobile display without a location sensor

The necessary pieces to enter are a central location and your specific latitude and longitude. Enter your hometown for the central location in the format of {city, state or principality}. To get your latitude and longitude, visit gmaps-samples-v3.googlecode.com/svn/trunk/geocoder/getlatlng.html and type in your library address. Copy the latitude and longitude values that are returned. Enter these position values as a comma-separated string {latitude,longitude}.

Here’s the image src= markup with cues for where to add your values.

http://maps.google.com/maps/api/staticmap?center={YOURHOMETOWN}&zoom=13&size=310x310&markers=color:blue|{YOUR LATITUDE/LONGITUDE}&mobile=true&sensor=false

Once you have the link filled out, add it to an image <img> tag on a live HTML page and you will have a local map to your library or any other specific location.
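If you’d rather assemble the URL with script than by hand, here’s a minimal JavaScript sketch. The buildStaticMapUrl() helper and the sample values are my own illustration, not part of the Google Static Maps API itself:

<script type="text/javascript">
 // Hypothetical helper: builds a Static Maps URL from a few options.
 function buildStaticMapUrl(center, zoom, size, lat, lng) {
  return 'http://maps.google.com/maps/api/staticmap' +
   '?center=' + encodeURIComponent(center) +
   '&zoom=' + zoom +
   '&size=' + size +
   '&markers=color:blue|' + lat + ',' + lng +
   '&mobile=true&sensor=false';
 }

 // Example values: the MSU Library in Bozeman, MT.
 var mapImg = document.createElement('img');
 mapImg.src = buildStaticMapUrl('Bozeman,MT', 13, '310x310', 45.666671, -111.04859);
 document.body.appendChild(mapImg);
</script>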

[Figure 2: Static map showing the MSU Library in Bozeman, MT]


Unix Global Find and Replace (using find, grep, and sed)

Finding and replacing strings and characters can be a dicey operation for a web developer. Too much can break your web site. Too little can miss the piece of HTML that needs to be updated or deleted. Many tools exist that can help you in the process – Dreamweaver, HomeSite, and many text editors have powerful interfaces for searching and replacing. (E.g., use Ctrl-F on Windows or Cmd-F on Mac to see the Find and Replace window in Dreamweaver.) But there’s some really great functionality on the command line for Unix/Linux users that shouldn’t be overlooked. I’ve been experimenting with a procedure for making these global matches and replaces within the Unix shell environment, and I wanted to document the process somewhere. This seems like as good a place as any…

Important: All the commands below must be run from the shell environment on a Unix or Linux system. If you aren’t sure what I’m talking about, check the Wikipedia reference for shell.

Step 1: Find the pattern needing to be replaced or updated, print out files needing change

find . -exec grep 'ENTER STRING OR TEXT TO SEARCH FOR' '{}' \; -print

*Note: I’m using the “find” and “grep” commands to search for a matching pattern, which will print out a list of files and directories that need changes. If I’m at the top level of my web site, the “.” in the find command will search for the pattern down through any directories below. On a large site, the process can take some time.
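As a concrete (made-up) example – the hostname is a placeholder of my own – this prints every file under the current directory that mentions an old server name:

find . -exec grep 'old.example.edu' '{}' \; -print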

Step 2: Move/copy files into test directory to test expression; preserve owners, groups, timestamps

cp -p -r test test-backup

*Note: These directories would be named according to the directories or files you need to change based on the results from Step 1. The “-p” flag preserves owners, groups, and timestamps in the copied directory. The “-r” flag copies recursively down through any associated sub-directories. I do this so I can compare the new directory to the original after I’ve test run the global changes.

Step 3: Run test on find and replace expression in /test-backup/ directory

find . \( -name "*.php" -or -name "*.html" \) | xargs grep -l 'ENTER STRING OR TEXT TO SEARCH FOR' | xargs sed -i -e 's/ENTER OLD STRING OR TEXT TO REPLACE/ENTER REPLACEMENT STRING OR TEXT/g'

*Note: I’m using the “find .” command to search for .php and .html files in the current working directory, since I only want to target the files that need to be touched (change the patterns according to your requirements). Next, I pipe that result to the “grep -l” command, which keeps only the files that actually contain the string or text specified. Finally, I pass the grep result to the “sed” command, which matches the string or text and replaces it with the new value.
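For a concrete (made-up) example – the hostnames below are placeholders of my own – here’s the same pipeline swapping an old server name for a new one:

find . \( -name "*.php" -or -name "*.html" \) | xargs grep -l 'old.example.edu' | xargs sed -i -e 's/old\.example\.edu/new\.example\.edu/g'

Note that the dots in the sed pattern are escaped because sed treats an unescaped dot as a wildcard matching any character.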

Step 4: Test files and applications to ensure changes didn’t break functionality and that owners, groups, timestamps were preserved

Step 5: After testing, run expression from Step 3 in ALL the directories or files needing the changes. Delete /test-backup/ directory

Steps 1 and 3 are the heart of the matter. I’m learning the power of these commands, so I’m pretty cautious about backing up and testing on directories and files that aren’t live. Once I have the expression dialed in, I’ll run it on a more global scale. So, there you have it – my find and replace process in a nutshell. Use at your own discretion and feel free to share your thoughts in the comments.


Ajax Workshop – Internet Librarian 2007

My preconference, “AJAX for Libraries”, with Karen Coombs went really well. It’s always great working with Karen. She’s cool, composed, and “wicked” knowledgeable. For the second year in a row, we had a great group of participants. It’s nice to see a growing interest in emerging web programming frameworks and how they might be applied to libraries.

Here are the updated slides and links:

“AJAX for Libraries” Presentation
“AJAX for Libraries” Handout (Code Samples and Explanations)
“AJAX for Libraries” Code Downloads

And some additional examples of libraries using AJAX:


What are your patrons doing online?

BusinessWeek has a quick snapshot of U.S. user activities online broken down by task and ages. (The study was conducted by the Forrester Research group.)

[Figure 3: “Who Participates And What People Are Doing Online” chart, by age group and task]

Source: http://www.businessweek.com/magazine/content/07_24/b4038405.htm.

Good news if you are into the web 2.0 stuff and you work in college libraries. The 18-21 set and the 22-26 set are well represented in most task categories. And how about those task categories: Creator, Collector, Spectator, Joiner, Critic, etc. They help to define the possible tasks of web 2.0 users. Granted, the activities recorded refer to patterns on the web at large, but they give libraries some guidance as to what our users are doing online. It’s also interesting to note the “Inactives”. There’s a whole population that doesn’t live and breathe the web. I get caught up in the “everybody’s online” thing, so it’s a nice gentle reminder.

The key is homing in on one or two likely applications for your library community and giving them a go. Want to enable the Collectors? Think about an XML feed into pieces of your digital collections. Or what about building a service for the Critics? Build a tagging or a comment/rating system into the catalog or digital collection.

An admission: it’s all well and good for the student set, but college libraries have other interested parties, like faculty and university administration. Part of the library role might be to “coach” these other parties into possible roles they might take up in their web use. A faculty member trying to build a research bibliography seems like the perfect candidate to become a “Collector” once given the right library tool or app. In this setting, outreach and education are still a library web developer’s best friend.


“TERRA: The Nature of our World” gets Webby Nomination

For the second time in as many months, “TERRA: The Nature of our World” (http://lifeonterra.com) has been noticed by the web critics. This time it’s for the Webbys, which have been called “the Oscars of the Internet”, and with judges like Beck and David Bowie the awards definitely have some celeb street cred. (The full MSU news story is available at http://www.montana.edu/cpa/news/nwview.php?article=4822.)

Once again, anyone has the opportunity to vote. TERRA’s nomination can be found in the student category of the Online Film and Video section at http://pv.webbyawards.com. Votes will be accepted through Friday, April 27, and winners will be announced May 1.

But the voting is not really the point of the post… I love that digital library initiatives were a cornerstone of this effort. There’s an opportunity here for all diginit folk. Find your niche as a content manager and provider. My “in” was metadata, creating XML for syndication, relational database design, a little PHP magic, and creating a search backend. The key was answering the need for a content manager and developing relationships toward that end. Think about what could be your “in”.


“TERRA: The Nature of our World” gets South by Southwest (SXSW) Nomination

I’ve mentioned the TERRA group project in previous posts. Earlier this month, I received some great news regarding the project. “TERRA: The Nature of our World” was nominated in the student/university website category of the SXSW Interactive Web Awards.

TERRA is a partnership between Montana PBS, the Media/Theatre Arts Department at Montana State University, Montana State University Libraries, and various independent filmmakers. Montana State University Libraries was brought in to build and code the site and to handle content management (metadata, data preservation architecture…). The site was designed with a nod to the future of digital libraries. It’s a digital video library with commenting, ratings, tags AND a controlled vocabulary. And it’s all wrapped up with some AJAX functionality and a Dublin Core/OAI metadata backend. The TERRA group also experimented with syndicating our content as podcasts. You can actually search iTunes for TERRA and receive our podcasts. That’s powerful stuff, and the reach of the site has been amazing. Just last week, TERRA podcasts were placed as default content in the download for the Democracy Player, effectively doubling the TERRA audience in one fell swoop. More and more, I see leveraging these types of communities as the future of library content distribution.

As for the nomination, I am honored and just a little surprised. SXSW is the center of the web geek world, and to even be considered is quite humbling. We’ve had some university press lately, but I didn’t see it going much further. (Check the MSU news release for complete details.) So, I’m heading to Austin, TX with a colleague in March. I’m excited to step off the library circuit and see how the other half lives. Stay tuned for SXSW updates.

And if the mood should strike you, vote for “TERRA: The Nature of our World” in the People’s Choice Award race at https://secure.sxsw.com/peoples_choice/.

