We are better than This

Rural? Rural enough?

It's not you, it's us (or is it?)

I continually learn things about the academic job market. I have sat on both sides of the table and also sat at the bar listening to the experiences of people who have sat on either side of the table. (The academy is a very sitting-intensive place to work!) 

Something has struck me recently about the academic job search. It's when an institution decides against an applicant with the explanation that "it's not that you aren't excellent - you just aren't a fit for us right now".

I'm not convinced that this statement always has truth to it. Many times people will gush about how great so-and-so is, but when the opportunity comes to hire them, they pass. It isn't every situation, but there certainly seem to be enough specific cases I can think of where the sentiment is, "that person is great for where they come from, but not really quite good enough for us." I've seen this take place where the call for a position is quite open, which makes the "not the right fit" justification much harder to buy.

The upshot is that the academy is a prickly beast when it comes to hiring. But at least one gets to spend a lot of time sitting in this business.

Blog maintenance

So, I am not great at blog maintenance. I have a bunch of stuff saved on my computer for things I wanted to write about but haven't gotten to, including:

  • A comparison of activity data obtained from the Apple Watch to data obtained from the Fitbit Surge watch (30 days of data!)
  • Commentary on the emergence of new personal informatics dashboards for mobile devices
  • The amazing opportunities for productive collaborations between information sciences and learning sciences
  • Code words used in academic job postings and the weird ways in which those words are understood
  • The social construction of prestige

None of those things have happened. I'm off sabbatical. Life keeps me busy. Perhaps another time?

Victor Lee Utah CV

The title of this post is Victor Lee Utah CV because I expect those to be the likely search terms for someone looking for my CV. Instead, they get this blog post.

I am not a big fan of posting my CV. To be clear, if my job required it or highly expected it of all faculty, then I would go ahead and post it. However, if I can get out of posting it, then I will. Why? Here are a few reasons:

  1. My CV has my home address. I could take that out, but many people do not. I do not want my home address on the internet, although I'm sure there are some weird sites that have the name 'lookup' or 'whereis' or something like that in their url that will offer to tell you where I lived/live for a nominal fee. Privacy is a weird thing, especially now. Chances are, you can find my home address and see that my lawn desperately needs mowing. However, if I can create an obstacle for that embarrassment of a lawn (and I would rather xeriscape but the cost right now is prohibitive and we hope to sell this house one day and invest in a future house) then I will leave that small obstacle.
  2. On several occasions, I have seen people attend a talk at a conference, and with their laptops open, search for [presenter name] [possibly their institutional affiliation if the name is common] and "cv". Then I see (and hear) an immediate dismissal because they did not go to a certain graduate school or did not publish in particular places or do not have famous co-authors. I know that is the heuristic that we use to make quick judgments, but I have seen it happen enough and people make judgments (and sometimes switch over to fully paying attention to their email or get up and leave a presentation) because of that. That makes me sad. I have been fortunate to have had affiliation with some places that some people respect. I have worked with some people that people respect. I am lucky in that regard. I do not want to be judged on that. I want people to pay attention to what I have to say at the time and engage with me at that moment. Maybe I am selfish, naive, or not comfortable with boasting. I just feel like there is a lot of snap judgment that is enabled by publicly posting CVs.
  3. Doesn't it seem a little weird that one's "life's work" (roughly what curriculum vitae translates to, with the words reversed) is inspectable by anyone? Like if I were a famous composer and someone actually wanted to write my biography, then sure. But right now, my life's work is more than what is on my CV. My life's work is how my kids turn out, what people think of me after they talk with me, and what the kids and students who interact with me in my work do and how they are impacted by the things I try to make happen. I can't list that in a publication entry or an awarded grant or in the name of a class I taught.
  4. Depending on where you are in your career, CVs say different things. I have dropped stuff I would have included when I was a graduate student. I imagine I will drop stuff I have now in 10 years. I co-authored a grant at one point that got funded, but it felt weird to list a grant on my CV for which I am not listed as a PI, so I have taken it off. Recently, I was given a little research award locally which was really thoughtful and nice for the moment and situation, but it is not something that I think should be on my official CV, especially if I am being examined by a national or international group of professional peers. I want the freedom to present myself as is appropriate at the time. We are different people depending on the circumstances, and I like to make sure I can fully be whichever version of me (and they are all me, but me in sweats is a little different than me in a suit) the situation demands.
  5. Do you want to see my academic transcript too? It sort of feels like that. We have FERPA and some etiquette about talking about grades. Look, if you really want to know what grades I got in school, you can contact me directly and ask me. I'll probably tell you. I am not ashamed of my grades. (Well, there was that bout of senioritis one term where I wanted to do some other stuff and being nerdy, calculated exactly how hard I needed to work to get a grade I thought was acceptable and did exactly that because I felt I had more important fish to fry.) I just do not feel like advertising some stuff about what I have done, although I will not deny it if asked. Probably some internet snooping would show that I graduated from undergrad magna cum laude and had some academic accolades because those become newsletter things that never die on the internet. You can use your imagination for my exact GPA or what sorts of school and academic activities I did if you really want. But I don't really feel like leaving it out there for anybody to see at any time.
  6. I like being mysterious. It makes it possible for me to mess with people more. I can have a wicked sense of humor (albeit a good-natured wicked sense of humor, as oxymoronic as that might sound).

Again, I would totally make my CV public if I needed to based on what my current institution required. (My home address would disappear, though!) However, given the option, I'll just keep my CV to myself and share if asked. You are welcome to ask me for it, and I probably would share it, so long as you and I have met at some point, and I know you aren't trying to be an internet scammer.

Peace out.

Fitbit Flex vs Zip vs Jawbone Up: Kind of old but maybe interesting

Unexpected assists to the academic work life

On boasting as an academic

I value humility. I'm not sure if it's a cultural thing, but I really like and respect the humble person. I like to think we all have similar anxieties and ambitions, we put our pants on one leg at a time, and we all read our smartphones while in the bathroom, etc. At the same time, I know I work with and around amazing people - amazing in the sense that they solve hard puzzles and ask fascinating questions or are so passionate about something that they go out and try to make things happen that make life better for people or other beings. In many ways, I am in such an amazing line of work to know and be inspired by these people.

At the same time, these people (and for now, I'm talking about academics) have an incentive to brag and boast. We are supposed to communicate value to our institutions and impress our colleagues and our students. It's part of the game. Being an academic is likened to being an entrepreneur - build a brand, make strategic partnerships, try to get investments, get key staff, and sell, sell, sell, sell. So we have to boast to make those things happen, right?

I've been amazingly fortunate to have received some distinctions and awards in the past. I really do not know how to respond to them besides saying 'thanks' uncomfortably after I hear congratulations. Frankly, it even makes me a bit uncomfortable to write these words, and I'm really not trying to #humblebrag. But I have noticed things on social media that look like flat-out boasting, and I don't care for it. When Facebook releases its dislike button, I'm not going to click on it. I'll probably click like because I'm happy that someone got some nice recognition - that's great and I want to be supportive. But at the same time, I'd rather discover it accidentally and in person, slap them on the back, buy them a drink, and comfortably embarrass them about being awesome.

So is boasting okay? In front of me? I think it's fine to boast that a grant got funded or a paper is accepted or finally published. Why? Those are collective challenges we all face, so we like to see one of our own succeed. We know the gauntlet that we must all run. It gives me hope to see someone make it on the other side. However, I don't really care to see other distinctions self-promoted. Exception: I think it's fine if it gets posted by someone other than the recipient - often universities use these as bragging things for alumni and such, and that ends up being in the public domain - so what happens to that is what happens to that. I think boasting about some touching thing that happened, like a student being so grateful that they leave a note that says you changed their life and their life direction in the best way possible is fine too. That is nice because it gives us something to aspire to and also reminds us of some of the good things we try to do and how we can have an impact.

That's just me. Maybe I am not meant for PR, although I do get asked to do PR things a lot. Maybe this is just my internal backlash to knowing what you have to do in academic PR. Or I'm just a really odd duck.


Divorces and separations

No, this isn't about me. If you know me, you likely know my partner/wife (I always preferred husband and wife as terminology rather than partner - I would like to refer to same sex couples' partners as their respective husband or wife, but I abide by whatever people request of me. Let's ignore the historical heterosexism and patriarchal roles that the words husband and wife entail for now). Anyway, my wife is great - way out of my league. People who meet or know her think she is sweet, beautiful, and just great. She's fairly quiet too. And they are right. I'm not just saying that because she'll read this. Honestly, I don't think she knows I maintain a blog because most of our conversations involve "who is picking up which kid? Did you go to the store or want me to stop by and get X?".

Anyway, as a friend commented on Twitter a couple of months ago, he realized that he had moved past the 'go to lots of weddings' period of his life and had just finished the 'go to lots of baby showers' part. He observed he is now in the 'talk friends through their divorce' phase. I'm realizing that I am entering that as well. I am not going to bother counting the relationship endings I am seeing or have seen recently. It's a good number.

I have to admit, this is very new territory for me. I did not experience divorce in the family firsthand. That's not to say things were hunky-dory, but I just haven't lived through that as personal experience. I had lots of friends as a child whose parents got divorced or were divorced. I understand it really can be the best thing for the family and respect that it's a private and personal decision. I did see for many of those friends that there were some wounds that lingered from that, but again, I believe that what the family needed to do was what the family needed to do.

Things I struggle with:

  • What to say to express my support to a person. I've tried lots of things, and all are some version of "I care and want to help you." It never seems to come out right.
  • Not taking sides when possible. Often I'm friends with both people in the marriage/impending divorce. However, I hear two versions of things. Objectively, some of it sounds heinous on one side - one person was clearly unfaithful or became abusive in some way. So that's definitely bad. But I'm hearing a lot of 'person X' became distant or they got seriously depressed and took it out on person Y. I've actually heard that a lot from several couples. So in those situations, I don't know what to say. I wing it. I try to be supportive, but I also want to stay out of sides.
  • Avoiding anything that sounds like advice. I haven't been there. Who am I to say what to do? Maybe I could recommend a lawyer, but I don't really know divorce lawyers or who is good or what you think about for those things. It is so difficult for me - not just emotionally, but like computationally - to think about how a home and a pair of lives will need to get split.


Even when it isn't divorce but some sort of separation, I am at a loss. Sometimes it is a trial separation. That is weird to me already. It's like trying out divorce territory but not really, so I feel extra cautious about those three things. Sometimes it is a necessary separation (like a job taking one spouse to another place). I see this in the academy a lot. That is hard for me to imagine doing, although I do not doubt that I may have to do it at some point in my career just to make sure the kids can finish school years while I or my wife starts a new job somewhere else. It may not be playing with fire, but it seems like it can be like buying a box of matches.

But on the topic of the separation leading to divorce or all that - The whole thing makes me feel a little sad because I was often at the wedding or on a group date or at a baby shower or whatever. I remember they used to be happy. I remember the vows and the toasts and how beautiful the bride looked or how the groom's eyes lit up upon seeing his partner to be at the other end of the aisle. I don't cry at weddings. I appreciate them. Partly because open bars and fancy appetizers are awesome.

Anyway, a little food for thought I had that has been sitting for the past week. It's not really work related, but this is a blog. I guess it's what you do on blogs - write something out into the ether, and maybe a nice spam bot finds it.

On academic publishing

I don't know who is going to find this blog besides spam-bots, but assuming some academic-y folks do, I thought I'd post on academic publishing. Specifically, I'll post on academic publishing in education (and my corner of it, which is learning sciences). I'm inspired partly because these were largely tacit rules I learned over time and also because I just saw this circulating: http://cacm.acm.org/magazines/2015/9/191173-should-conferences-meet-journals-and-where/fulltext

For good or ill, publishing is our bread and butter. It's how we are judged as being valuable as researchers (although if you keep generating big money grants and don't really publish much, I think your university or institution might see the value of having you). Let's ignore how the "publish or perish" mentality means too much stuff of "meh" quality gets written just so we can report large numbers of pubs to our institutions. It's a problem, but not one I want to talk about here. 

In a nutshell, goodness of publication venue is a judgment we should leave to the experts in that academic community. They know what is good and why. There are journals that are highly ranked but have the boring same old same old and, frankly, don't advance our knowledge in a meaningful way. I also say this deference to people in the field is important because I hear time and again that fields like Computer Science are penalized at the university level because the publications expected in the CS community are conference proceedings but institutions want journal articles. Hey, CS conferences are no cakewalk! Their acceptance rates are tougher than a lot of journals. If I'm evaluating a job or tenure candidate's CV based on CS conferences, color me impressed if I see regular papers in well known and top conferences. My field is, I believe, getting to a point where conference proceedings papers have some real oomph to them. Conferences with published proceedings also have a quicker turnaround, so the inherent delays in the journal publishing system are mitigated. We actually learn what is going on in the field much sooner rather than reading really interesting stuff on data collected in 2005. (Seriously, I'm pretty sure I just reviewed a journal manuscript like that. I think it was fine, but I have been reviewing a lot of papers and can't remember each and every one after I submit my review.)

So here are my opinions, influenced by my field, of publication types - probably aimed toward junior faculty, and subject to dissent from others: 

  • Journal articles:  still considered the gold standard, probably because nearly all fields have them and have to go through the same sort of review process. Impact factors and official rankings help let people outside your field know it's a respected journal (although talking about impact factor at a cocktail party of non-academics isn't going to win over any new friends), but people inside your field know the true good journals. (Defer to the specific community!) Good journals may not have high impact numbers, but impact factor is for show. If someone tries to impress me by mentioning the impact factor of a journal they published in, they fail. Frankly, an article in a lesser-known journal that is good and gets read and cited is plenty good IMHO. Just don't do that for all your papers. And journals that are crap and will publish whatever you write as long as you pay publication fees reflect terribly on the scholar. I actually think it is okay to have one or two articles in unimpressive journals as long as they have a respectable process and the paper is interesting. While a journal exists for nearly every topic, there are some things you may write about that just don't fit a known journal. That's fine. But it's not good to have half or most of your articles in those. And special issues, I think, are frankly great. Special issues are getting more popular because people read them. So fine by me if you publish in a special issue. Often, it means you are respected enough in the sub-area to be part of the club, and the club is big enough to populate an entire journal issue.
  • Book chapters:  good to have some in books with respected publishers, and good to have them with a well known editor. Bad to have them with unknown publishers, but again, an occasional piece with an unknown one isn't awful so long as it is an interesting chapter that people will choose to read. I like book chapters because you often see some pretty bold assertions that would get sanitized in journal peer review. That doesn't mean the assertions are right, but they can spark interesting ideas and new directions. They can be really fun to write! In fact, I expect solid folks to have at least one or a couple, showing they are respected enough to get invited.
  • Book reviews:  fine for grad students, not for faculty. The only exception is if you are famous and are reviewing someone else who is famous. Then people want to see what you have to say. Because we are nosy.
  • Encyclopedia entries:  depends on the publisher and editor. Obviously not where you want to put all your efforts, but if it has big names and you are invited, that's a privilege. No one is going to cite your entry but people may read it. This is, like book chapters, a situation where you are judged by the company you keep.
  • Published conference proceedings papers:   Great if it is from a respected and competitive conference that makes its proceedings accessible. People will cite published conference papers! Good also if it is in a CS conference or a close cousin of a CS conference. 
  • Conference papers without proceedings:  you have to have a steady stream of these, but no one but you is ever going to cite them. Good to have for known conferences and good to have lots of them. They don't give you any special perks - they're just the price of admission. And we are assuming these are peer reviewed, rather than "pay and you can present".
  • Authored Book:   not worth it unless it is with a well known press, it will have wide distribution, and is on a specialized topic that doesn't have books and is something lots of people really care about even if they aren't academics. This is time sensitive - a topic that used to be in vogue but isn't now is not a good use of time. Gotta be ahead of the curve and gotta be right about being ahead of the curve! And it doesn't replace journal articles, or even published conference proceedings papers, IMO. In general, not worth the time for someone young, but there are exceptions.

  • Edited Book:   icing on cake, but it isn't the cake. Mostly shows you can coordinate a big writing project. A lot of the respectability of the book rides on the topic, press, and the quality of contributors.  (Note: I did an edited book and I am proud of it, but I think it's because I just wanted to do it and would not be stopped. It's definitely not for the money - I could make more money at a neighborhood poker night, and I am not a great poker player. I could make more money holding a yard sale. I could make more money if I got to search the couch cushions of every house on my street, etc)

  • Popular press articles:   better be a really popular press, like the New York Times or Time Magazine or a very popular blog (i.e. Not this one!) 
  • Practitioner articles:  lots of us want to improve practice and these things might actually get read by a teacher. So one or two are okay but should be in addition to rather than instead of the other stuff deemed important.  These do get read and cited, but the academy is like that gymnast from the Olympics who made that face that went viral: not impressed. 
  • Textbooks:  nope, bad idea, especially for noobs.

Well, those are my piecemeal thoughts. Someone who reads this is likely to disagree with me, but in 2015 and in the little corner of education research I live in, it's what I am noticing. I may update this all in the future. I am pretty sure the fonts are screwy, so I should update it just to fix that.

Update: I know html but don't have the patience to dig around in the code and fix the formatting that this web service thing hides from me. So shifting fonts it is.

Building capacity in cyberlearning around youth and data science

Some time ago, without us really noticing, some amazing colleagues and I were awarded a grant from the National Science Foundation Cyberlearning program to bring together some great minds to figure out new frameworks for connecting youth and data science with today's rapidly changing technology infrastructure. Short version: We think that youth are going to benefit from learning with and about data, but we need some updating of our ideas about what that will look like. It looks to be an exciting new opportunity and an excuse for me to work with really sharp people. What is especially cool about this is that the idea for it came from a roundtable brainstorm involving the four of us at a Cyberlearning meeting that was intended to foster collaborations and address needs in the emerging space of cyberlearning. Going to those meetings and putting heads together paid off!

Okay, back to what I was doing, which was probably some sort of work-related writing.

The fitbit stinks, but I miss it

The last post compared some data from the Fitbit Charge HR and the Apple Watch, with a little bit of Withings iOS app tracking thrown in for good measure. Since that time, I have been relying primarily on the Apple Watch. Why? Because I paid much more for it. On top of that, I needed to use the Fitbit I was wearing for research purposes, and I try to be very cheap with my research expenses and use whatever equipment I have on hand (or on wrist). So the Fitbit Charge HR that I use most often is currently living elsewhere and hopefully generating interesting data for some of the research I do with kids (self-promotion: see my out-of-date publications page).

Also, the Fitbit stunk. Literally. I know this is a known issue, but my wrist and the device itself smelled bad. I stopped wearing it at night so it could 'air out'. I've tried cleaning it with rubbing alcohol with a cotton swab and various other things. The result: a bracelet that smells like alcohol and still slightly stinks. My father, to whom I had sent a Fitbit Charge for a past parent appreciation holiday, called me once to ask about how to deal with the smell.

I don't think it's too much to ask for a device that is easier to clean or more smell resistant. I was getting pretty tempted to start Febrezing that thing. I gave it a super duper aggressive cleaning and a period of no skin contact before we got it circulating in research again, so hopefully the gross factor decreased some.

With all that said, I have to say that I really prefer the Fitbit for activity tracking. It's been striking to me how much less often I check my stats on my Apple Watch. We've ascertained from the last post that it does a reasonable job tracking, although it undercounts somewhat. But consider the difference here and understand why I miss wearing the Fitbit.

To check steps on the Fitbit:

  •  I raise my wrist. With the latest Fitbit firmware, the accelerometer can tell my arm has gone up and it displays the time (way to copy Apple on that one)
  • I press the button on the side once, and I see my steps.

To check steps on the Apple Watch:

  • I raise my wrist. A half second later, the watch face loads and then my nice jellyfish motion show begins.
  • I actually want to check steps, so I swipe upwards to go to my shortcuts. I slide leftward from the heart rate screen, which I guess was the one I checked last, and realize I am going the wrong direction because I go to the settings shortcut, which is the left edge of the options. I swipe 3 times the other way and go to the weird bullseye concentric ring thing that tells me I haven't exercised enough today (Stop shaming me, you expensive but beautiful device!)
  • I press on the bullseye and it takes me to the watch app for the bullseye (the activity tracker).
  • I then swipe upward and get a display of steps, calories, and distance. 
  • Phew! Luckily, I did this quickly enough before the display went to sleep!

OR, I could have done this:

  • I raise my wrist. A half second later, the watch face loads and then my nice jellyfish motion show begins.
  •  I press the crown. It takes me to the weird fisheye display of all the installed watch apps. It's kind of creepy. There was that internet thing about surfaces with lots of tiny holes on them that freaked people out. It reminds me of that. It also reminds me of the second boss in the old Sega game Altered Beast that had a bunch of eyeballs and would fire those at you. Of course, the beast power was flying dragon and you just go up to the eyeball monster and go all Blanka-electric shock over and over and over again and that becomes the shortest and easiest boss fight ever since King Hippo in Mike Tyson's Punch Out.
  • I scan the many eyeball buttons, and while I ought to remember that the activity tracker is just to the bottom left of the watch face app by default, I forget and spend precious milliseconds looking (and also getting confused by the app that lets you use a timer to track a specific exercise - but makes you use a completely separate app to do so). I press it.
  • I then swipe upward and get a display of steps, calories, and distance. 
  • Phew! Luckily, I did this quickly enough before the display went to sleep!

The number of steps involved is much higher in both Apple Watch scenarios. That's pretty annoying. Even one extra step is annoying, let alone three or more. (Yes, I know being grateful for avoiding display shut-off is not a step, but I am trying to work against the clock.) People who think about user experience know even one or two extra steps can be a deal breaker. We just do not have the patience for those steps, and the opportunity for error goes up really quickly.

Don Norman, author of required reading in many design and HCI courses and current head of the UCSD Design Lab, wrote a little rant about how Apple lost its user experience way. He is right. This experience, while pretty to look at and neat that it's possible, is pretty annoying compared to 'press the button once' on the Fitbit. But what do I know? And why do I stick with the Apple Watch? (Oh, right, because I paid a lot of money for it, the Fitbit is being used by someone else for work, and it made my wrist smell bad - market opportunity for teeny tiny sticks of deodorant, anyone?)

PS - I know there is a watch face that has a one touch access to the steps, calories, distance screen. The problem is that it is ugly, and it does not have jellyfish. This is the problem with the limited watch faces.  

I'll consider posting about something besides a wearable doohickey soon. As a parting image, here is that eyeball monster from Altered Beast I was talking about.

This image came from a website dedicated to nut shots (which is a strange thing to have, but having stuff like this was the entire point of creating the Internet, right?) http://www.sodahead.com/fun/best-nut-shot/question-3251747/



Comparing Step Counts: Apple Watch, Fitbit Charge HR, and iOS Withings App

For the past several weeks, I have been wearing an Apple Watch (nerd bling!), a Fitbit Charge HR, and not exactly wearing but still carrying my mobile phone like all the other technology-addicted almost-middle-aged people out there. The phone has the Withings app on it because I track my weight so that I can cycle back and forth between exuberance (Alright! I lost 5 pounds!) and being bummed out (And I just gained 7 pounds. PBBBBBBT!!!!). The Withings app tracks steps based on the M7 chip inside my iPhone 5S (why yes, I give Apple a lot of my money, thanks for asking). So I wanted to see how the three compared to each other. This is a casual comparison - I'm not running statistical tests or anything this time around.

The ground rules

After getting my Apple Watch, I decided to do this comparison thing. I was going to wear both the Fitbit and the Watch each day as I normally would. What's normal? When I wake up in the morning, I put on each device. The Watch goes on my non-dominant (left) arm. The Fitbit goes on my dominant (right) arm. I wasn't picky about which went on first - usually, it was whichever I could fumble and grab first. I decided some time ago (about when I discovered my wrist was getting sort of smelly from the Fitbit Charge) that I was not going to wear any devices at night nor track my sleep. (Yeah, bad self-tracker, I know.) I set the recording of each device to the correct arm in their corresponding apps. I carried my phone with me as I normally would - which means it goes with me to breakfast and goes with me when I walk the dog in the morning. When I get to work, it stays with me for the most part, but I will forget and leave it on my desk. At the gym, it sometimes goes with me onto a treadmill, but it might also linger in the locker. If I am going to be going into water, like the pool or the beach or into the shower, I take off all devices and keep the phone away from water. For the most part, I went about my life.

Normal life involves walking the above-mentioned dog, going on outings with the family, doing simple social activities with friends, and working at a job that involves a lot of sitting near a computer and cursing at the photocopier. I occasionally forget my phone for some reason, panic, and then gradually accept that I am phoneless for several hours. That is normal, and I did not track which days I left my phone somewhere. That is just life. I was trying to run an experiment that was fairly true to my life.

For days tracked, I picked the window between 6/3/2015 and 7/21/2015. Are these special days? An anniversary or obscure holiday? No. I got the watch on the 2nd, but it was late in the day, so I didn't have a full day of data on it, whereas the other devices/app had more time to be attached to my person. I actually wanted to go all the way through July, but my phone had been having issues and I had to get it replaced (hooray AppleCare! and yes, more money to Apple for AppleCare), so I lost Withings step data after the 21st. That's why the window ends there.

Getting Data Out

QS Labs was kind enough to release an iOS app called QS Access that let me get a .csv file of my Apple Health data. (Note: when I peek at the data in the Health app, it seems that some of the data points, like a span of a minute or two, are from the phone rather than the watch? But I didn't care enough to dig into it and assume this is Apple being smart about getting a more thorough picture. I'm sure I could read some message board and get a lot more details, but I'm just calling data from the Health app "watch data" even though it isn't 100% true.)

Fitbit data I grabbed from the Fitbit dashboard. I know there are hacks to grab data - in fact, I'm associated with one of them that will grab it in minute increments - but I just wanted to compare daily totals. Maybe one day in the future I will look at minute-by-minute or hour-by-hour or some other increment to see if there is something cool going on.

Withings lets you just export .csv files from their web interface, so that was easy enough. Just a few clicks here and there. Then stick it all into a spreadsheet and make a few plots. Again, I'm being lazy about this and am not running any serious statistics. I have had the most experience with Fitbit devices (I once tested the zip against the flex and saw the flex undercounted quite a bit relative to the zip), so I figured I'd use the Fitbit device as a baseline. Anyway, here are the results.
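If you'd rather script the spreadsheet step than click and drag, here's a minimal sketch of the merge-and-compare in Python. The CSV layout (a "date" column and a "steps" column) and the numbers are made up for illustration - the real exports from each service name their columns differently and would need a little massaging first.

```python
import csv
import io

def load_daily_steps(csv_text):
    """Parse a date -> steps dict from a CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["date"]: int(row["steps"]) for row in reader}

def deviation_from_baseline(baseline, other):
    """Fractional deviation of `other` from `baseline` for each shared day.
    Negative values are undercounts, positive values are overcounts."""
    shared = baseline.keys() & other.keys()
    return {day: (other[day] - baseline[day]) / baseline[day]
            for day in sorted(shared)}

# Tiny made-up stand-ins for the real exports
fitbit = load_daily_steps("date,steps\n2015-06-03,10000\n2015-06-04,8000\n")
watch = load_daily_steps("date,steps\n2015-06-03,9000\n2015-06-04,8400\n")
print(deviation_from_baseline(fitbit, watch))
# {'2015-06-03': -0.1, '2015-06-04': 0.05}
```

Same idea as the deviation bar charts below: a value of -0.1 means a 10% undercount relative to the baseline device for that day.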

Result 1: Fitbit Charge HR tends to count more steps

If you take into account my previous experience showing that the flex seemed to undercount relative to the hip-based clip-on zip (which is more similar to the research-grade pedometers that exercise science people use), then we might assume that the Charge HR undercounts relative to whatever my true number of steps is. (Also, published research suggests as much.) No matter though - I'd rather have an undercount than an overcount so that I push myself a little more. But what is interesting is that the Charge HR, assuming it was undercounting, still counted more steps than the other two. See the poorly labeled graph that has not been cropped below.

You would think I could keep a legend, but no. The Green is Fitbit Charge HR, Blue is Apple Watch, and Yellow is the Withings app.


The Apple Watch and the Charge track pretty close, but the green always seems to be a little higher. If you want to see how much of a deviation there was each day - in decimal approximations, then just keep scrolling.

This feller shows how much the Apple Watch count deviated from what the Charge HR recorded. The decimals correspond to percentages, but I just didn't feel like actually making it show as a percentage. Anyway, the shorter the bar, the closer the numbers were. Bars that point downwards are undercounts. Bars that point upwards are overcounts. You can see that it's pretty close, with 4 days that undercounted by more than 20%. The overcounts were on 6 days, and only one of those was more than 20% off.


So the conclusion here is that a Fitbit Charge HR on the dominant hand seems to count more steps than an Apple Watch on a non-dominant hand. Presumably, the algorithms that they use in each would account for dominant/non-dominant. And for the days that the counting was way off, I'm willing to believe something dumb happened that I don't remember, like a battery dying. (It happens.)

If you want to see it as being like a correlation, here's that plot.

The numbers on the axes are step counts. Probably should have mentioned that sooner, but you probably figured it out, right? If not, sorry. The very first graph will make more sense now. Like I said, poorly labeled!


If you ever took a statistics course, you probably covered correlation. You probably saw the pictures of a magical correlation of +1 or -1 (everything lies perfectly on a line of slope +1 or -1) and a correlation of zero (which, if memory serves, was always like a perfect dot-filled circle in the textbooks. But let's not start on that - there is a cottage industry of ranting against textbooks, and I even did a chapter of my dissertation on that which eventually became a journal article.) Anyway, this has an upward line that looks +1ish, although there are some points hovering above the line (Apple undercounts relative to Fitbit). I didn't want to run any statistics, and a correlation barely counts because it's like a click, drag, and button click in your favorite spreadsheet program - so here it is. For these two, r = 0.81. That seems like a pretty high correlation. It's not 1, but getting a 1 is pretty darn hard.
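For anyone curious what the spreadsheet's click-drag-button trick is actually doing, here is plain Pearson's r computed by hand in Python. The step counts below are invented stand-ins, not my actual data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance-ish numerator and the two standard-deviation-ish terms
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up daily step totals for two devices
fitbit_steps = [10000, 8000, 12000, 9500]
watch_steps = [9000, 8400, 11000, 9100]
print(round(pearson_r(fitbit_steps, watch_steps), 2))
# 0.96
```

A value near 1 means the two devices rise and fall together day to day, even if one consistently counts a bit higher than the other.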

Okay, how about the Withings? Again, relative to Fitbit's thing, we get the following deviation plot.

The bars are red because they are like tears of blood. Sometimes it overcounts (like 9 days?) and the rest of the time it undercounts. But when it undercounts, IT REALLY UNDERCOUNTS. There are a lot more days that are more than 20% off. And not like a sale-price 20% off. It's like "WHOOPS!"


So it turns out that forgetting the phone at home or in the car or on my desk makes it a bad step counter. Who would have thunk?! But still, assuming I'm good about carrying my phone with me on the days that don't have the super sad droopy red bars, it's still a pretty big deviation.

Some time ago, there was a hubbub about some study about smartphones being reasonably good compared to a wearable tracker. Of course, the news media went to town with it and said wearable trackers are the worst thing ever and so take that you annoying techno-posers! But the study actually said that smartphones are a reasonable approximation when you have people doing something like... walking on a treadmill for a science experiment. If you don't have or don't want a wearable device but want to track, then by all means, use your phone! (Just don't forget it at home or in the car or on your desk.) But if you want numbers that are a little... um, higher? Then a wearable that you don't have to think about leaving in the car or on your desk is fine. At least that's how it worked for this guy who has two thumbs and devices on each arm.

Correlation now? Here it is for the Fitbit vs. the phone app.

Just to confuse you, the Fitbit Charge HR steps are on the x-axis instead of the y-axis.


Well, that plot is sort of line-like - it looks more like a line than a dotted circle. Heck, if I were a sociologist and got this as my plot, I'd probably start doing the Mipos Dance of Joy! (I'd also try to hide the fact that my N is so small.) But doing that click, drag, button trick gives me this: r = 0.60.

That's not terrible. It is certainly a good approximation, and I am saying this without actually looking at any data, but I imagine it is probably close to the correlation of height and weight of American adults of a certain age. So it isn't bad - you can get reasonably close (if you remember to carry the darn smartphone everywhere, which we think we are good at, but I can't even tell you the number of times the words "Do you know where I put my phone?" are uttered each day in my house).

What's that, you want one more correlation? For fun? Okay. The remaining pairwise correlation.

I moved the Apple Watch counts back to the x-axis just to be annoying.


This last scatter plot looks sort of like a dragon? Probably because it is green, but I can kind of make out a neck and a tail. It seems the least pretty to me, but since Game of Thrones became a thing, my relationship to dragons has changed. I'd still be delighted if I were a sociologist (and ashamed of the N - let's not forget the shame). r = 0.82. So that's not bad if we use the Watch as our baseline? But like I said before, it seemed like the Watch might get some of its step counts from phone accelerometer motion, so that could be part of what's going on.

Parting thoughts

Well, I'm happy to use all three device options (especially because the Watch is useless without the phone) even though I'd get double the wristband tan. However, I've scaled down to one for now (the Watch - Apple nerd bling makes me care more about things that do not relate to data, like overpriced beauty). My hunch is that the Fitbit does a better job of getting closer to "REAL" steps, but we are a long way from any wrist-based or handheld device getting to that level of accuracy. I could elaborate on that hunch, but I won't right now.

And I know there are a bunch of different analyses I could and should do - average counts, significance tests (okay fine, paired t-tests are all significant at the 0.01 level), weekends vs. weekdays, excluding some outlier days, etc. But if you want my tl;dr (which you wouldn't know about because I put it at the bottom, and if you dr'd it, you wouldn't know it's here), it is that phone apps don't seem as reliable or consistent as a wearable device when situated in the actual world of human use, as determined from this human's use. Beyond that, pick your poison. More steps can come out of the Fitbit wrist-worn device, which makes it an attractive option. That is, until you make and sell something with the Apple logo on it. Then all bets are off.

Oh, and my team published a case study of a tween girl who came to a similar conclusion that you can read here. And the whole question of accuracy is something that a bunch of elementary school kids explored too. We published that also, and one day it will be freely available.

Beginning a blog, or the beginning, middle, and end of a false start.

Does the world need a blog from me? Probably not. But this site, which I have created so as to make sure that I have an enduring web presence (because my university is like most others and changes web platforms about every 18 months and makes content cumbersome to change) seems blog-appropriate. So, ta-da! Here is a blog.

I can't promise it will be updated regularly, and I can't promise that it will be very interesting. It will tend toward the professional side and focus on academia, education, and technology. My goal is to put out "interesting" stuff, and maybe someone will read it. At the very least, some spam-bot will find it and email me with offers of analytics services or invites to conferences that don't really exist. And maybe some people will start trolling me, making me think I must have made it big. Or it will just occupy unviewed bits of cyberspace and eventually disappear when I forget to renew the domain name, which will inevitably happen in the future.