
Hello World!

This is my first blog entry!

I plan to use this blog to post my data science codes, exercises, projects, and ideas from time to time. The languages I use include Python, R, and MATLAB.

It is still a work in progress and I hope these personal notes can benefit other data scientists somehow.

How to Download Your Fitbit Second-Level Data Without Coding

If you are a Fitbit user who wants to save a copy of your Fitbit data on your computer but doesn’t have advanced programming skills, this tutorial is right for you! You don’t need to do any coding at all to save your second-level data. I had been struggling to get all the so-called ‘intraday data’ for quite a while. I found many useful resources online, for example, Paul’s tutorial, Collin Chaffin’s PowerShell module, and the Fitbit-Python API, but they are somewhat complicated and I just could not make any of them work smoothly. Recently I finally figured out a way to download these Fitbit data without any coding. Are you ready?

Step 1: Register an app on

First you need to register an account, and then click ‘MANAGE YOUR APPS’ in the top right area. Next, click the ‘Register a new app’ button at the top right to get started. This step is simple: just make sure the OAuth 2.0 Application Type is set to ‘Personal’, and that the Callback URL is complete – including the ‘http://’ part and also a ‘/’ at the end. Here I used ‘’ as an example. I have also used this blog’s URL ‘’, which worked just fine.


After you click the red ‘Register’ button, you will be able to see the credentials for the app you just registered. The ‘OAuth 2.0 Client ID’ and ‘Client Secret’ will be used in the next step.



Step 2: Use the OAuth 2.0 tutorial page

Next you need to right click the ‘OAuth 2.0 tutorial page’ link and open it in a new tab, so that you can look back at your app’s credentials easily. Make sure the ‘Implicit Grant Flow’ is chosen instead of ‘Authorization Code Flow’ – this will make things much easier! After you copy/paste the ‘Client ID’ and ‘Client Secret’ into the blanks and put in the Redirect URI, click the auto-generated link.


Then you will see the confirmation page. Just click ‘Allow’.


And you will be led to the ‘Redirect URI’, which is ‘’ in this case.  But the address bar now shows a very long string which is the token for your app.


Next you need to copy and paste everything in the address bar, but without the starting part ( ), into the ‘Parse response’ section, and hit the Enter key once. This way you can clearly see what the token is, what the scope is, and how long the token is valid. In this case, the token is valid for one week, which equals 604800 seconds. Pretty good, right?
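If you’d rather not use the ‘Parse response’ box, the same fragment can be split apart with a few lines of Python. This is just a sketch: the fragment string below is a made-up example with a fake token, and the field names (access_token, scope, expires_in) are the standard ones returned by the OAuth 2.0 implicit grant flow.

```python
from urllib.parse import parse_qs

# Everything after the '#' in the redirect address bar (made-up values here).
fragment = 'access_token=abc123&user_id=XXXXXX&scope=heartrate+sleep&expires_in=604800'

# parse_qs returns lists; keep just the first value of each field.
params = {k: v[0] for k, v in parse_qs(fragment).items()}

print(params['access_token'])             # the token you will reuse later
print(int(params['expires_in']) // 86400) # validity in days (604800 s = 7 days)
```

The same parsing works for the real fragment from your own address bar.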



Step 3: Make request and get the data!

After you are done with the ‘Parse response’, the next step is ready for you automatically.


Just click the ‘Send to’ link, then ‘Launch Request’ on the new page. Make sure to tell the site that you are not a robot, too.


After that, if you see the ‘200 OK’ status, everything worked fine, and you can find the data you want in the lower half of the page. Like this:


If you click ‘view raw’ on the right side, the ‘BODY’ will change to raw text, and you can simply copy/paste it into a text editor. And that’s all you need to do to download your Fitbit second-level data! As I promised, you don’t need any programming skills to accomplish this. Isn’t that cool?

Additional tips:

tip 1: other data types

If you want some data other than the ‘user profile’, you can simply change the ‘API endpoint URL’ in the ‘Make Request’ step. According to the Fitbit API documentation, you can get the heart rate data by using this URL:

Or the sleep data by using:
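The endpoint URLs did not survive in this page, so as an illustration, here is how such URLs can be assembled in Python. The path patterns are my reading of the Fitbit Web API documentation and may have changed – double-check them against the current docs before use.

```python
# Assumed Fitbit Web API base path ('-' means the logged-in user).
base = 'https://api.fitbit.com/1/user/-'
date = '2016-06-14'  # any yyyy-mm-dd day you want

# Intraday (second-level) heart rate for one day, at 1-second resolution.
heart_url = '%s/activities/heart/date/%s/1d/1sec.json' % (base, date)

# Sleep log for the same day.
sleep_url = '%s/sleep/date/%s.json' % (base, date)

print(heart_url)
print(sleep_url)
```

Paste the resulting URL into the ‘API endpoint URL’ blank of the ‘Make Request’ step.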

tip 2: save json file in an easier way

If you think the ‘Send to’ method is not fast enough, you can copy the ‘curl’ command auto-generated in the ‘Make Request’ step and run it in a terminal (on Windows, that is the ‘cmd’ window). Add the following to the end of the copied ‘curl’ command so that the data will be saved to your disk:

>> data_file.json

tip 3: save multiple days’ data automatically

I think Fitbit provides the functionality to let users download multiple days’ data with one command, by specifying the date range in the curl request. However, I could not make it work for some unknown reason. Therefore I used a piece of Python code to download multiple days’ data instead. Here’s the code, which I think is pretty self-explanatory.

import requests
import json
import pandas as pd
from time import sleep

# Put the token for your app in between the single quotes.
token = ''

# Make a list of dates.
# You can change the start and end date as you want;
# just make sure to use the yyyy-mm-dd format.
start_date = '2015-12-28'
end_date = '2016-06-14'
datelist = pd.date_range(start=pd.to_datetime(start_date),
                         end=pd.to_datetime(end_date)).tolist()

# The code below uses a for loop to generate one URL for each day in datelist,
# then requests each day's data and saves it into an individual json file.
# Because Fitbit limits requests to 150 per hour, the code sleeps for
# 30 seconds between requests to stay under that limit.
for ts in datelist:
    date = ts.strftime('%Y-%m-%d')
    url = '' + date + '/1d/1sec/time/00:00/23:59.json'
    filename = 'HR' + date + '.json'
    response = requests.get(url=url, headers={'Authorization': 'Bearer ' + token})
    if response.ok:
        with open(filename, 'w') as f:
            f.write(response.text)
        print(date + ' is saved!')
    else:
        print('The file of %s is not saved due to an error!' % date)
    sleep(30)
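Once the daily files are saved, you may want to get the second-level readings into a table. The snippet below is a sketch that assumes the intraday heart-rate JSON has the 'activities-heart-intraday' → 'dataset' layout described in the Fitbit API documentation (a list of {'time', 'value'} pairs); the sample data here is made up, and the filename is hypothetical.

```python
import pandas as pd

def intraday_to_frame(day_json, date):
    """Turn one day's intraday heart-rate JSON into a two-column DataFrame."""
    dataset = day_json['activities-heart-intraday']['dataset']
    df = pd.DataFrame(dataset)  # columns: time, value
    # Combine the date with each time-of-day string into a full timestamp.
    df['datetime'] = pd.to_datetime(date + ' ' + df['time'])
    return df[['datetime', 'value']]

# Made-up sample matching the documented layout; with a real saved file you
# would use json.load(open('HR2016-06-14.json')) instead.
sample = {'activities-heart-intraday': {'dataset': [
    {'time': '00:00:00', 'value': 62},
    {'time': '00:00:01', 'value': 63},
]}}
df = intraday_to_frame(sample, '2016-06-14')
print(df)
```

From there, pd.concat over all the daily frames gives one continuous series.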

Happy hacking!

My Experience with Udacity Data Analyst Nano-degree

After spending most of my spare time on it for the past 8 months, I finally graduated from the Udacity Data Analyst Nano-Degree program! Before I started this program, I spent many hours searching online for reviews and discussions about it. Now I would like to share my whole experience and hope it is helpful to someone like me. Since I have also taken courses on other platforms, I can make some direct comparisons, which should be helpful too.

First, I would say it really requires a lot of time to finish the degree. I spent roughly 15+ hours each week on this program over the past 8 months. This may not sound like a lot of time, but it is, especially if you have another full-time job. So don’t jump into it if you can’t afford the time. As for tuition, I paid $1,600 for the program, but Udacity will refund half of it because I finished the program within 12 months and paid all of my tuition out of my own pocket. I haven’t received the refund yet because Udacity told me it takes 4–8 weeks to process. Just don’t forget to submit a request for this refund – Udacity will not refund it automatically.

The 8 projects covered a wide range of aspects of the data science field, including statistics, Python programming, R programming, machine learning, and D3.js data visualization. The Python and R programming focused on data manipulation, wrangling, and visualization. The machine learning course is really condensed and does not go deep into algorithms and theory, compared to other machine learning courses. Overall, this nano-degree really focuses on analysis skills such as processing data and finding interesting stories. If you want to be a data scientist instead of a data analyst, this nano-degree is probably not the best choice for you.

There are many things I really liked about this program. First, Udacity has an amazing ‘customer support’ team. The coaches provide 1-on-1 help sessions. Of course these coaching sessions need to be reserved first, which is fairly easy to do. Each help session is scheduled to be 20 minutes long, but a coach once chatted with me for more than an hour, until I really solved the problem. I only used online text chatting, but it seems the coaches are open to other communication methods such as video chatting or phone calls as well. In addition, the discussion forum is a good resource that helped me finish all the projects. The coaches reply to questions VERY quickly, usually in 30 minutes or less. And they are always very patient! The coaches also review the project submissions in great detail, give constructive feedback, and encourage the students all the time. I think this coaching team is the factor that makes this nano-degree program stand out, compared to other MOOC courses or specializations.

However, I believe this program still has some room for improvement. My biggest frustrations came from the course videos. Maybe it is because Udacity considers the course videos only as supporting materials, or maybe because the courses are taught by mentors from industry, but I felt that the course videos are nothing like a real class. For a substantial portion, the videos are just two or more mentors talking. The course videos did not really help me much in finishing my projects. I liked the course videos on other platforms much better because they are better organized and the content is taught systematically. That is not the case with Udacity courses, at least for the data analyst nano-degree.

Another question people care about is whether this program really helps students find a job. Well, I can’t tell, because I am just me, one sample, and there is not even a control sample. But at least the program gave me something to talk about regarding data analysis during my interviews, so I would say, yes, it is useful.

Please feel free to comment below if you would like to take the program, are in the middle of the program, or have graduated. I’d be happy to answer any questions about this Udacity Data Analyst Nano-degree.