[noticket] Hide momentary flash of text on sparkline cells.

file:a/.gitignore -> file:b/.gitignore
*.py[co]
*.py~
.gitignore
ckan.log

# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg

# Private info
credentials.json
token.dat

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
.tox

#Translations
*.mo

#Mr Developer
.mr.developer.cfg
   
file:a/README.rst -> file:b/README.rst
ckanext-ga-report
=================

**Status:** Development

**CKAN Version:** 1.7.1+


Overview
--------

For creating detailed reports of CKAN analytics, including totals per group.

Whereas ckanext-googleanalytics focuses on providing page view stats for a recent period and for all time (aimed at end users), ckanext-ga-report is more interested in building regular periodic reports (more for site managers to monitor).

Contents of this extension:

* Use the CLI tool to download Google Analytics data for each time period into this extension's database tables

* Users can view the data as web page reports
   
   
Installation
------------

1. Activate your CKAN python environment and install this extension's software::

    $ pyenv/bin/activate
    $ pip install -e git+https://github.com/datagovuk/ckanext-ga-report.git#egg=ckanext-ga-report

2. Ensure your development.ini (or similar) contains the info about your Google Analytics account and configuration::

    googleanalytics.id = UA-1010101-1
    googleanalytics.account = Account name (e.g. data.gov.uk, see top level item at https://www.google.com/analytics)
    googleanalytics.token.filepath = ~/pyenv/token.dat
    ga-report.period = monthly
    ga-report.bounce_url = /

   The ``ga-report.bounce_url`` option specifies a particular path to record the bounce rate for; typically it is ``/`` (the home page).

3. Set up this extension's database tables using a paster command. (Ensure your CKAN pyenv is still activated, run the command from ``src/ckanext-ga-report``, and alter the ``--config`` option to point to your site config file)::

    $ paster initdb --config=../ckan/development.ini

4. Enable the extension in your CKAN config file by adding it to ``ckan.plugins``::

    ckan.plugins = ga-report
   
Troubleshooting
---------------

* ``(ProgrammingError) relation "ga_url" does not exist``

  This means that the ``paster initdb`` step has not been run successfully. Refer to the installation instructions for this extension.
   
   
Authorization
-------------

Before you can access the data, you need to set up the OAuth details. You can do this by following the `instructions <https://developers.google.com/analytics/resources/tutorials/hello-analytics-api>`_; the outcome is a file called credentials.json, which should look like credentials.json.template with the relevant fields completed. The steps are repeated below for convenience:

1. Visit the `Google APIs Console <https://code.google.com/apis/console>`_

2. Sign in and create a project, or use an existing project.

3. In the `Services pane <https://code.google.com/apis/console#:services>`_, activate the Analytics API for your project. If prompted, read and accept the terms of service.

4. Go to the `API Access pane <https://code.google.com/apis/console/#:access>`_

5. Click "Create an OAuth 2.0 client ID...".

6. Fill out the Branding Information fields and click Next.

7. In Client ID Settings, set Application type to Installed application.

8. Click Create client ID.

9. The details you need below are Client ID, Client secret, and Redirect URIs.


Once you have set up your credentials.json file you can generate an OAuth token file by using the
following command, which will store your token in a file called token.dat once you have finished
giving permission in the browser::

    $ paster getauthtoken --config=../ckan/development.ini

Now ensure you reference the correct path to your token.dat in your CKAN config file (e.g. development.ini)::

    googleanalytics.token.filepath = ~/pyenv/token.dat
   
   
Tutorial
--------

Download some GA data and store it in CKAN's database. (Ensure your CKAN pyenv is still activated, run the command from ``src/ckanext-ga-report``, and alter the ``--config`` option to point to your site config file)::

    $ paster loadanalytics latest --config=../ckan/development.ini

The final argument specifies how much data you want to retrieve; it can be:

* **all** - data for all time (since 2010)

* **latest** - (default) just the 'latest' data

* **YYYY-MM-DD** - just data for all time periods going back to (and including) this date
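The dispatch on this argument can be sketched as follows. This is a minimal illustration based on the ``LoadAnalytics.command`` code in this changeset (which parses explicit months with ``'%Y-%m'``); ``resolve_time_period`` is a hypothetical helper name, not part of the extension:

```python
import datetime

def resolve_time_period(args):
    # Hypothetical helper mirroring LoadAnalytics.command: no argument
    # defaults to 'latest'; 'all' and 'latest' pass through; anything
    # else must parse as a month, or strptime raises ValueError.
    time_period = args[0] if args else 'latest'
    if time_period in ('all', 'latest'):
        return time_period
    return datetime.datetime.strptime(time_period, '%Y-%m')
```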
   
   
   
Software Licence
================

This software is developed by Cabinet Office. It is Crown Copyright and opened up under the Open Government Licence (OGL) (which is compatible with the Creative Commons Attribution License).

OGL terms: http://www.nationalarchives.gov.uk/doc/open-government-licence/
   
import logging
import datetime
import os

from pylons import config

from ckan.lib.cli import CkanCommand
# No other CKAN imports allowed until _load_config is run,
# or logging is disabled


class InitDB(CkanCommand):
    """Initialise the extension's database tables
    """
    summary = __doc__.split('\n')[0]
    usage = __doc__
    max_args = 0
    min_args = 0

    def command(self):
        self._load_config()

        import ckan.model as model
        model.Session.remove()
        model.Session.configure(bind=model.meta.engine)
        log = logging.getLogger('ckanext.ga_report')

        import ga_model
        ga_model.init_tables()
        log.info("DB tables are setup")
   
   
class GetAuthToken(CkanCommand):
    """Gets the Google auth token

    Usage: paster getauthtoken <credentials_file>

    Where <credentials_file> is the file name containing the details
    for the service (obtained from https://code.google.com/apis/console).
    By default this is set to credentials.json
    """
    summary = __doc__.split('\n')[0]
    usage = __doc__
    max_args = 0
    min_args = 0

    def command(self):
        """
        In this case we don't want a valid service, but rather just to
        force the user through the auth flow. We allow this to complete to
        act as a form of verification instead of just getting the token and
        assuming it is correct.
        """
        from ga_auth import init_service
        init_service('token.dat',
                     self.args[0] if self.args
                     else 'credentials.json')
   
class FixTimePeriods(CkanCommand):
    """
    Fixes the 'All' records for GA_Urls

    It is possible that older urls that haven't recently been visited
    do not have All records. This command will traverse through those
    records and generate valid All records for them.
    """
    summary = __doc__.split('\n')[0]
    usage = __doc__
    max_args = 0
    min_args = 0

    def __init__(self, name):
        super(FixTimePeriods, self).__init__(name)

    def command(self):
        import ckan.model as model
        from ga_model import post_update_url_stats
        self._load_config()
        model.Session.remove()
        model.Session.configure(bind=model.meta.engine)

        log = logging.getLogger('ckanext.ga_report')

        log.info("Updating 'All' records for old URLs")
        post_update_url_stats()
        log.info("Processing complete")
   
   
   
class LoadAnalytics(CkanCommand):
    """Get data from Google Analytics API and save it
    in the ga_model

    Usage: paster loadanalytics <time-period>

    Where <time-period> is:
        all     - data for all time
        latest  - (default) just the 'latest' data
        YYYY-MM - just data for the specific month
    """
    summary = __doc__.split('\n')[0]
    usage = __doc__
    max_args = 1
    min_args = 0

    def __init__(self, name):
        super(LoadAnalytics, self).__init__(name)
        self.parser.add_option('-d', '--delete-first',
                               action='store_true',
                               default=False,
                               dest='delete_first',
                               help='Delete data for the period first')
        self.parser.add_option('-s', '--skip_url_stats',
                               action='store_true',
                               default=False,
                               dest='skip_url_stats',
                               help='Skip the download of URL data - just do site-wide stats')

    def command(self):
        self._load_config()

        from download_analytics import DownloadAnalytics
        from ga_auth import (init_service, get_profile_id)

        ga_token_filepath = os.path.expanduser(config.get('googleanalytics.token.filepath', ''))
        if not ga_token_filepath:
            print 'ERROR: In the CKAN config you need to specify the filepath of the ' \
                  'Google Analytics token file under key: googleanalytics.token.filepath'
            return

        try:
            svc = init_service(ga_token_filepath, None)
        except TypeError:
            print ('Have you correctly run the getauthtoken task and '
                   'specified the correct token file in the CKAN config under '
                   '"googleanalytics.token.filepath"?')
            return

        downloader = DownloadAnalytics(svc, profile_id=get_profile_id(svc),
                                       delete_first=self.options.delete_first,
                                       skip_url_stats=self.options.skip_url_stats)

        time_period = self.args[0] if self.args else 'latest'
        if time_period == 'all':
            downloader.all_()
        elif time_period == 'latest':
            downloader.latest()
        else:
            # The month to use
            for_date = datetime.datetime.strptime(time_period, '%Y-%m')
            downloader.specific_month(for_date)
   
import re
import csv
import sys
import json
import logging
import operator
import collections
from ckan.lib.base import (BaseController, c, g, render, request, response, abort)

import sqlalchemy
from sqlalchemy import func, cast, Integer
import ckan.model as model
from ga_model import GA_Url, GA_Stat, GA_ReferralStat, GA_Publisher

log = logging.getLogger('ckanext.ga-report')

DOWNLOADS_AVAILABLE_FROM = '2012-12'


def _get_month_name(strdate):
    import calendar
    from time import strptime
    d = strptime(strdate, '%Y-%m')
    return '%s %s' % (calendar.month_name[d.tm_mon], d.tm_year)
   
def _get_unix_epoch(strdate):
    from time import strptime, mktime
    d = strptime(strdate, '%Y-%m')
    return int(mktime(d))


def _month_details(cls, stat_key=None):
    '''
    Returns a list of all the periods for which we have data; unfortunately
    this knows too much about the type of the cls being passed, as GA_Url has
    a more complex query

    This may need extending if we add a period_name to the stats
    '''
    months = []
    day = None

    q = model.Session.query(cls.period_name, cls.period_complete_day)\
        .filter(cls.period_name != 'All').distinct(cls.period_name)
    if stat_key:
        q = q.filter(cls.stat_name == stat_key)

    vals = q.order_by("period_name desc").all()

    if vals and vals[0][1]:
        day = int(vals[0][1])
        ordinal = 'th' if 11 <= day <= 13 \
            else {1: 'st', 2: 'nd', 3: 'rd'}.get(day % 10, 'th')
        day = "{day}{ordinal}".format(day=day, ordinal=ordinal)

    for m in vals:
        months.append((m[0], _get_month_name(m[0])))

    return months, day
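The ordinal-suffix formatting used for ``period_complete_day`` in ``_month_details`` can be exercised in isolation. This is an illustrative sketch only; ``ordinal_day`` is a hypothetical name and is not part of the extension:

```python
def ordinal_day(day):
    # Same rule as _month_details: 11-13 always take 'th'; otherwise the
    # suffix follows the last digit (1st, 2nd, 3rd, everything else 'th').
    suffix = 'th' if 11 <= day <= 13 \
        else {1: 'st', 2: 'nd', 3: 'rd'}.get(day % 10, 'th')
    return '%d%s' % (day, suffix)
```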
   
   
class GaReport(BaseController):

    def csv(self, month):
        import csv

        q = model.Session.query(GA_Stat).filter(GA_Stat.stat_name != 'Downloads')
        if month != 'all':
            q = q.filter(GA_Stat.period_name == month)
        entries = q.order_by('GA_Stat.period_name, GA_Stat.stat_name, GA_Stat.key').all()

        response.headers['Content-Type'] = "text/csv; charset=utf-8"
        response.headers['Content-Disposition'] = str('attachment; filename=stats_%s.csv' % (month,))

        writer = csv.writer(response)
        writer.writerow(["Period", "Statistic", "Key", "Value"])

        for entry in entries:
            writer.writerow([entry.period_name.encode('utf-8'),
                             entry.stat_name.encode('utf-8'),
                             entry.key.encode('utf-8'),
                             entry.value.encode('utf-8')])
   
   
    def index(self):

        # Get the month details by fetching distinct values and determining the
        # month names from the values.
        c.months, c.day = _month_details(GA_Stat)

        # Work out which month to show, based on query params of the first item
        c.month_desc = 'all months'
        c.month = request.params.get('month', '')
        if c.month:
            c.month_desc = ''.join([m[1] for m in c.months if m[0] == c.month])

        q = model.Session.query(GA_Stat).\
            filter(GA_Stat.stat_name == 'Totals')
        if c.month:
            q = q.filter(GA_Stat.period_name == c.month)
        entries = q.order_by('ga_stat.key').all()

        def clean_key(key, val):
            if key in ['Average time on site', 'Pages per visit', 'New visits', 'Bounce rate (home page)']:
                val = "%.2f" % round(float(val), 2)
                if key == 'Average time on site':
                    mins, secs = divmod(float(val), 60)
                    hours, mins = divmod(mins, 60)
                    val = '%02d:%02d:%02d (%s seconds) ' % (hours, mins, secs, val)
                if key in ['New visits', 'Bounce rate (home page)']:
                    val = "%s%%" % val
            if key in ['Total page views', 'Total visits']:
                val = int(val)

            return key, val
   
        # Query historic values for sparkline rendering
        sparkline_query = model.Session.query(GA_Stat)\
            .filter(GA_Stat.stat_name == 'Totals')\
            .order_by(GA_Stat.period_name)
        sparkline_data = {}
        for x in sparkline_query:
            sparkline_data[x.key] = sparkline_data.get(x.key, [])
            key, val = clean_key(x.key, float(x.value))
            tooltip = '%s: %s' % (_get_month_name(x.period_name), val)
            sparkline_data[x.key].append((tooltip, x.value))
        # Trim the latest month, as it looks like a huge dropoff
        for key in sparkline_data:
            sparkline_data[key] = sparkline_data[key][:-1]

        c.global_totals = []
        if c.month:
            for e in entries:
                key, val = clean_key(e.key, e.value)
                sparkline = sparkline_data[e.key]
                c.global_totals.append((key, val, sparkline))
        else:
            d = collections.defaultdict(list)
            for e in entries:
                d[e.key].append(float(e.value))
            for k, v in d.iteritems():
                if k in ['Total page views', 'Total visits']:
                    v = sum(v)
                else:
                    v = float(sum(v)) / float(len(v))
                sparkline = sparkline_data[k]
                key, val = clean_key(k, v)

                c.global_totals.append((key, val, sparkline))

        # Sort the global totals into a more pleasant order
        def sort_func(x):
            key = x[0]
            total_order = ['Total page views', 'Total visits', 'Pages per visit']
            if key in total_order:
                return total_order.index(key)
            return 999
        c.global_totals = sorted(c.global_totals, key=sort_func)
   
        keys = {
            'Browser versions': 'browser_versions',
            'Browsers': 'browsers',
            'Operating Systems versions': 'os_versions',
            'Operating Systems': 'os',
            'Social sources': 'social_networks',
            'Languages': 'languages',
            'Country': 'country'
        }

        def shorten_name(name, length=60):
            return (name[:length] + '..') if len(name) > 60 else name

        def fill_out_url(url):
            import urlparse
            return urlparse.urljoin(g.site_url, url)

        c.social_referrer_totals, c.social_referrers = [], []
        q = model.Session.query(GA_ReferralStat)
        q = q.filter(GA_ReferralStat.period_name == c.month) if c.month else q
        q = q.order_by('ga_referrer.count::int desc')
        for entry in q.all():
            c.social_referrers.append((shorten_name(entry.url), fill_out_url(entry.url),
                                       entry.source, entry.count))

        q = model.Session.query(GA_ReferralStat.url,
                                func.sum(GA_ReferralStat.count).label('count'))
        q = q.filter(GA_ReferralStat.period_name == c.month) if c.month else q
        q = q.order_by('count desc').group_by(GA_ReferralStat.url)
        for entry in q.all():
            c.social_referrer_totals.append((shorten_name(entry[0]), fill_out_url(entry[0]), '',
                                             entry[1]))

        for k, v in keys.iteritems():
            q = model.Session.query(GA_Stat).\
                filter(GA_Stat.stat_name == k).\
                order_by(GA_Stat.period_name)
            # Buffer the tabular data
            if c.month:
                entries = []
                q = q.filter(GA_Stat.period_name == c.month).\
                    order_by('ga_stat.value::int desc')

            d = collections.defaultdict(int)
            for e in q.all():
                d[e.key] += int(e.value)
            entries = []
            for key, val in d.iteritems():
                entries.append((key, val,))
            entries = sorted(entries, key=operator.itemgetter(1), reverse=True)

            # Run a query on all months to gather graph data
            graph_query = model.Session.query(GA_Stat).\
                filter(GA_Stat.stat_name == k).\
                order_by(GA_Stat.period_name)
            graph_dict = {}
            for stat in graph_query:
                graph_dict[stat.key] = graph_dict.get(stat.key, {
                    'name': stat.key,
                    'data': []
                })
                graph_dict[stat.key]['data'].append({
                    'x': _get_unix_epoch(stat.period_name),
                    'y': float(stat.value)
                })
            graph = [graph_dict[x[0]] for x in entries]
            setattr(c, v + '_graph', json.dumps(_to_rickshaw(graph, percentageMode=True)))

            # Get the total for each set of values and then set the value as
            # a percentage of the total
            if k == 'Social sources':
                total = sum([x for n, x, graph in c.global_totals if n == 'Total visits'])
            else:
                total = sum([num for _, num in entries])
            setattr(c, v, [(k, _percent(v, total)) for k, v in entries])

        return render('ga_report/site/index.html')
   
   
class GaDatasetReport(BaseController):
    """
    Displays the pageview and visit count for datasets
    with options to filter by publisher and time period.
    """
    def publisher_csv(self, month):
        '''
        Returns a CSV of each publisher with the total number of dataset
        views & visits.
        '''
        c.month = month if not month == 'all' else ''
        response.headers['Content-Type'] = "text/csv; charset=utf-8"
        response.headers['Content-Disposition'] = str('attachment; filename=publishers_%s.csv' % (month,))

        writer = csv.writer(response)
        writer.writerow(["Publisher Title", "Publisher Name", "Views", "Visits", "Period Name"])

        top_publishers, top_publishers_graph = _get_top_publishers(None)

        for publisher, view, visit in top_publishers:
            writer.writerow([publisher.title.encode('utf-8'),
                             publisher.name.encode('utf-8'),
                             view,
                             visit,
                             month])

    def dataset_csv(self, id='all', month='all'):
        '''
        Returns a CSV with the number of views & visits for each dataset.

        :param id: A Publisher ID or None if you want for all
        :param month: The time period, or 'all'
        '''
        c.month = month if not month == 'all' else ''
        if id != 'all':
            c.publisher = model.Group.get(id)
            if not c.publisher:
                abort(404, 'A publisher with that name could not be found')

        packages = self._get_packages(c.publisher)
        response.headers['Content-Type'] = "text/csv; charset=utf-8"
        response.headers['Content-Disposition'] = \
            str('attachment; filename=datasets_%s_%s.csv' % (c.publisher_name, month,))

        writer = csv.writer(response)
        writer.writerow(["Dataset Title", "Dataset Name", "Views", "Visits", "Resource downloads", "Period Name"])

        for package, view, visit, downloads in packages:
            writer.writerow([package.title.encode('utf-8'),
                             package.name.encode('utf-8'),
                             view,
                             visit,
                             downloads,
                             month])
   
    def publishers(self):
        '''A list of publishers and the number of views/visits for each'''

        # Get the month details by fetching distinct values and determining the
        # month names from the values.
        c.months, c.day = _month_details(GA_Url)

        # Work out which month to show, based on query params of the first item
        c.month = request.params.get('month', '')
        c.month_desc = 'all months'
        if c.month:
            c.month_desc = ''.join([m[1] for m in c.months if m[0] == c.month])

        c.top_publishers, graph_data = _get_top_publishers()
        c.top_publishers_graph = json.dumps(_to_rickshaw(graph_data))

        return render('ga_report/publisher/index.html')
   
def _get_packages(self, publisher=None, count=-1): def _get_packages(self, publisher=None, count=-1):
'''Returns the datasets in order of visits''' '''Returns the datasets in order of views'''
if count == -1: have_download_data = True
count = sys.maxint  
   
month = c.month or 'All' month = c.month or 'All'
  if month != 'All':
  have_download_data = month >= DOWNLOADS_AVAILABLE_FROM
   
q = model.Session.query(GA_Url,model.Package)\ q = model.Session.query(GA_Url,model.Package)\
.filter(model.Package.name==GA_Url.package_id)\ .filter(model.Package.name==GA_Url.package_id)\
.filter(GA_Url.url.like('/dataset/%')) .filter(GA_Url.url.like('/dataset/%'))
if publisher: if publisher:
q = q.filter(GA_Url.department_id==publisher.name) q = q.filter(GA_Url.department_id==publisher.name)
q = q.filter(GA_Url.period_name==month) q = q.filter(GA_Url.period_name==month)
q = q.order_by('ga_url.visitors::int desc') q = q.order_by('ga_url.pageviews::int desc')
top_packages = [] top_packages = []
  if count == -1:
for entry,package in q.limit(count): entries = q.all()
  else:
  entries = q.limit(count)
   
  for entry,package in entries:
if package: if package:
top_packages.append((package, entry.pageviews, entry.visitors)) # Downloads ....
  if have_download_data:
  dls = model.Session.query(GA_Stat).\
  filter(GA_Stat.stat_name=='Downloads').\
  filter(GA_Stat.key==package.name)
  if month != 'All': # Fetch everything unless the month is specific
  dls = dls.filter(GA_Stat.period_name==month)
  downloads = 0
  for x in dls:
  downloads += int(x.value)
  else:
  downloads = 'No data'
  top_packages.append((package, entry.pageviews, entry.visits, downloads))
else: else:
log.warning('Could not find the package associated with this URL') log.warning('Could not find the package associated with this URL')
   
return top_packages return top_packages
   
def read(self): def read(self):
''' '''
Lists the most popular datasets across all publishers Lists the most popular datasets across all publishers
''' '''
return self.read_publisher(None) return self.read_publisher(None)
   
def read_publisher(self, id): def read_publisher(self, id):
''' '''
Lists the most popular datasets for a publisher (or across all publishers) Lists the most popular datasets for a publisher (or across all publishers)
''' '''
count = 20 count = 20
   
c.publishers = _get_publishers() c.publishers = _get_publishers()
   
id = request.params.get('publisher', id) id = request.params.get('publisher', id)
if id and id != 'all': if id and id != 'all':
c.publisher = model.Group.get(id) c.publisher = model.Group.get(id)
if not c.publisher: if not c.publisher:
abort(404, 'A publisher with that name could not be found') abort(404, 'A publisher with that name could not be found')
c.publisher_name = c.publisher.name c.publisher_name = c.publisher.name
c.top_packages = [] # package, dataset_views in c.top_packages c.top_packages = [] # package, dataset_views in c.top_packages
   
# Get the month details by fetching distinct values and determining the # Get the month details by fetching distinct values and determining the
# month names from the values. # month names from the values.
c.months = _month_details(GA_Url) c.months, c.day = _month_details(GA_Url)
   
# Work out which month to show, based on query params of the first item # Work out which month to show, based on query params of the first item
c.month = request.params.get('month', '') c.month = request.params.get('month', '')
if not c.month: if not c.month:
c.month_desc = 'all months' c.month_desc = 'all months'
else: else:
c.month_desc = ''.join([m[1] for m in c.months if m[0]==c.month]) c.month_desc = ''.join([m[1] for m in c.months if m[0]==c.month])
   
month = c.mnth or 'All' month = c.month or 'All'
c.publisher_page_views = 0 c.publisher_page_views = 0
q = model.Session.query(GA_Url).\ q = model.Session.query(GA_Url).\
filter(GA_Url.url=='/publisher/%s' % c.publisher_name) filter(GA_Url.url=='/publisher/%s' % c.publisher_name)
entry = q.filter(GA_Url.period_name==c.month).first() entry = q.filter(GA_Url.period_name==c.month).first()
c.publisher_page_views = entry.pageviews if entry else 0 c.publisher_page_views = entry.pageviews if entry else 0
   
c.top_packages = self._get_packages(c.publisher, 20) c.top_packages = self._get_packages(c.publisher, 20)
   
  # Graph query
  top_package_names = [ x[0].name for x in c.top_packages ]
  graph_query = model.Session.query(GA_Url,model.Package)\
  .filter(model.Package.name==GA_Url.package_id)\
  .filter(GA_Url.url.like('/dataset/%'))\
  .filter(GA_Url.package_id.in_(top_package_names))
  graph_dict = {}
  for entry,package in graph_query:
  if not package: continue
  if entry.period_name=='All': continue
  graph_dict[package.name] = graph_dict.get(package.name,{
  'name':package.title,
  'data':[]
  })
  graph_dict[package.name]['data'].append({
  'x':_get_unix_epoch(entry.period_name),
  'y':int(entry.pageviews),
  })
  graph = [ graph_dict[x] for x in top_package_names ]
   
  c.graph_data = json.dumps( _to_rickshaw(graph) )
   
return render('ga_report/publisher/read.html') return render('ga_report/publisher/read.html')
   
  def _to_rickshaw(data, percentageMode=False):
  if data==[]:
  return data
  # Create a consistent x-axis between all series
  num_points = [ len(series['data']) for series in data ]
  ideal_index = num_points.index( max(num_points) )
  x_axis = []
  for series in data:
  for point in series['data']:
  x_axis.append(point['x'])
  x_axis = sorted( list( set(x_axis) ) )
  # Zero pad any missing values
  for series in data:
  xs = [ point['x'] for point in series['data'] ]
  for x in set(x_axis).difference(set(xs)):
  series['data'].append( {'x':x, 'y':0} )
  if percentageMode:
  def get_totals(series_list):
  totals = {}
  for series in series_list:
  for point in series['data']:
  totals[point['x']] = totals.get(point['x'],0) + point['y']
  return totals
  # Transform data into percentage stacks
  totals = get_totals(data)
  # Roll insignificant series into a catch-all
  THRESHOLD = 0.01
  raw_data = data
  data = []
  for series in raw_data:
  for point in series['data']:
  fraction = float(point['y']) / totals[point['x']]
  if not (series in data) and fraction>THRESHOLD:
  data.append(series)
# Overwrite data with a set of interesting series
  others = [ x for x in raw_data if not (x in data) ]
  data.append({
  'name':'Other',
  'data': [ {'x':x,'y':y} for x,y in get_totals(others).items() ]
  })
  # Turn each point into a percentage
  for series in data:
  for point in series['data']:
  point['y'] = (point['y']*100) / totals[point['x']]
  # Sort the points
  for series in data:
  series['data'] = sorted( series['data'], key=lambda x:x['x'] )
  # Strip the latest month's incomplete analytics
  series['data'] = series['data'][:-1]
  return data
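Before the percentage transform, `_to_rickshaw` pads every series onto a shared x-axis so Rickshaw can stack them: any x value present in one series but missing from another gets an explicit zero point. A minimal, self-contained sketch of that zero-padding step (the function name and sample data here are illustrative, not the extension's own):

```python
def pad_series(data):
    # Collect every x value that appears in any series
    x_axis = sorted({point['x'] for series in data for point in series['data']})
    for series in data:
        xs = {point['x'] for point in series['data']}
        # Add an explicit zero for each missing x so all series align
        for x in set(x_axis) - xs:
            series['data'].append({'x': x, 'y': 0})
        # Rickshaw expects each series sorted by x
        series['data'].sort(key=lambda p: p['x'])
    return data

series = [
    {'name': 'a', 'data': [{'x': 1, 'y': 5}]},
    {'name': 'b', 'data': [{'x': 1, 'y': 2}, {'x': 2, 'y': 3}]},
]
padded = pad_series(series)
# series 'a' gains a zero point at x=2
```

Without this padding, stacked rendering would mis-align the series wherever one of them has no data for a given month.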
   
   
def _get_top_publishers(limit=20): def _get_top_publishers(limit=20):
''' '''
Returns a list of the top 20 publishers by dataset visits. Returns a list of the top 20 publishers by dataset views.
(The number to show can be varied with 'limit') (The number to show can be varied with 'limit')
''' '''
  month = c.month or 'All'
connection = model.Session.connection() connection = model.Session.connection()
q = """ q = """
select department_id, sum(pageviews::int) views, sum(visitors::int) visits select department_id, sum(pageviews::int) views, sum(visits::int) visits
from ga_url from ga_url
where department_id <> ''""" where department_id <> ''
if c.month: and package_id <> ''
q = q + """ and url like '/dataset/%%'
and period_name=%s and period_name=%s
""" group by department_id order by views desc
q = q + """  
group by department_id order by visits desc  
""" """
if limit: if limit:
q = q + " limit %s;" % (limit) q = q + " limit %s;" % (limit)
   
# Add this back (before and period_name =%s) if you want to ignore publisher  
# homepage views  
# and not url like '/publisher/%%'  
   
top_publishers = [] top_publishers = []
res = connection.execute(q, c.month) res = connection.execute(q, month)
  department_ids = []
for row in res: for row in res:
g = model.Group.get(row[0]) g = model.Group.get(row[0])
if g: if g:
  department_ids.append(row[0])
top_publishers.append((g, row[1], row[2])) top_publishers.append((g, row[1], row[2]))
return top_publishers  
  graph = []
  if limit is not None:
  # Query for a history graph of these publishers
  q = model.Session.query(
  GA_Url.department_id,
  GA_Url.period_name,
  func.sum(cast(GA_Url.pageviews,sqlalchemy.types.INT)))\
  .filter( GA_Url.department_id.in_(department_ids) )\
  .filter( GA_Url.period_name!='All' )\
  .filter( GA_Url.url.like('/dataset/%') )\
  .filter( GA_Url.package_id!='' )\
  .group_by( GA_Url.department_id, GA_Url.period_name )
  graph_dict = {}
  for dept_id,period_name,views in q:
  graph_dict[dept_id] = graph_dict.get( dept_id, {
  'name' : model.Group.get(dept_id).title,
  'data' : []
  })
  graph_dict[dept_id]['data'].append({
  'x': _get_unix_epoch(period_name),
  'y': views
  })
  # Sort dict into ordered list
  for id in department_ids:
  graph.append( graph_dict[id] )
  return top_publishers, graph
   
   
def _get_publishers(): def _get_publishers():
''' '''
Returns a list of all publishers. Each item is a tuple: Returns a list of all publishers. Each item is a tuple:
(names, title) (name, title)
''' '''
publishers = [] publishers = []
for pub in model.Session.query(model.Group).\ for pub in model.Session.query(model.Group).\
filter(model.Group.type=='publisher').\ filter(model.Group.type=='publisher').\
filter(model.Group.state=='active').\ filter(model.Group.state=='active').\
order_by(model.Group.name): order_by(model.Group.name):
publishers.append((pub.name, pub.title)) publishers.append((pub.name, pub.title))
return publishers return publishers
   
def _percent(num, total): def _percent(num, total):
p = 100 * float(num)/float(total) p = 100 * float(num)/float(total)
return "%.2f%%" % round(p, 2) return "%.2f%%" % round(p, 2)
   
import os import os
import logging import logging
import datetime import datetime
import collections import collections
from pylons import config from pylons import config
  from ga_model import _normalize_url
import ga_model import ga_model
   
#from ga_client import GA #from ga_client import GA
   
log = logging.getLogger('ckanext.ga-report') log = logging.getLogger('ckanext.ga-report')
   
FORMAT_MONTH = '%Y-%m' FORMAT_MONTH = '%Y-%m'
MIN_VIEWS = 50 MIN_VIEWS = 50
MIN_VISITS = 20 MIN_VISITS = 20
  MIN_DOWNLOADS = 10
   
class DownloadAnalytics(object): class DownloadAnalytics(object):
'''Downloads and stores analytics info''' '''Downloads and stores analytics info'''
   
def __init__(self, service=None, profile_id=None, delete_first=False): def __init__(self, service=None, profile_id=None, delete_first=False,
  skip_url_stats=False):
self.period = config['ga-report.period'] self.period = config['ga-report.period']
self.service = service self.service = service
self.profile_id = profile_id self.profile_id = profile_id
self.delete_first = delete_first self.delete_first = delete_first
  self.skip_url_stats = skip_url_stats
   
def specific_month(self, date): def specific_month(self, date):
import calendar import calendar
   
first_of_this_month = datetime.datetime(date.year, date.month, 1) first_of_this_month = datetime.datetime(date.year, date.month, 1)
_, last_day_of_month = calendar.monthrange(int(date.year), int(date.month)) _, last_day_of_month = calendar.monthrange(int(date.year), int(date.month))
last_of_this_month = datetime.datetime(date.year, date.month, last_day_of_month) last_of_this_month = datetime.datetime(date.year, date.month, last_day_of_month)
  # if this is the latest month, note that it is only up until today
  now = datetime.datetime.now()
  if now.year == date.year and now.month == date.month:
  last_day_of_month = now.day
  last_of_this_month = now
periods = ((date.strftime(FORMAT_MONTH), periods = ((date.strftime(FORMAT_MONTH),
last_day_of_month, last_day_of_month,
first_of_this_month, last_of_this_month),) first_of_this_month, last_of_this_month),)
self.download_and_store(periods) self.download_and_store(periods)
   
   
def latest(self): def latest(self):
if self.period == 'monthly': if self.period == 'monthly':
# from first of this month to today # from first of this month to today
now = datetime.datetime.now() now = datetime.datetime.now()
first_of_this_month = datetime.datetime(now.year, now.month, 1) first_of_this_month = datetime.datetime(now.year, now.month, 1)
periods = ((now.strftime(FORMAT_MONTH), periods = ((now.strftime(FORMAT_MONTH),
now.day, now.day,
first_of_this_month, now),) first_of_this_month, now),)
else: else:
raise NotImplementedError raise NotImplementedError
self.download_and_store(periods) self.download_and_store(periods)
   
   
def for_date(self, for_date): def for_date(self, for_date):
assert isinstance(for_date, datetime.datetime) assert isinstance(for_date, datetime.datetime)
periods = [] # (period_name, period_complete_day, start_date, end_date) periods = [] # (period_name, period_complete_day, start_date, end_date)
if self.period == 'monthly': if self.period == 'monthly':
first_of_the_months_until_now = [] first_of_the_months_until_now = []
year = for_date.year year = for_date.year
month = for_date.month month = for_date.month
now = datetime.datetime.now() now = datetime.datetime.now()
first_of_this_month = datetime.datetime(now.year, now.month, 1) first_of_this_month = datetime.datetime(now.year, now.month, 1)
while True: while True:
first_of_the_month = datetime.datetime(year, month, 1) first_of_the_month = datetime.datetime(year, month, 1)
if first_of_the_month == first_of_this_month: if first_of_the_month == first_of_this_month:
periods.append((now.strftime(FORMAT_MONTH), periods.append((now.strftime(FORMAT_MONTH),
now.day, now.day,
first_of_this_month, now)) first_of_this_month, now))
break break
elif first_of_the_month < first_of_this_month: elif first_of_the_month < first_of_this_month:
in_the_next_month = first_of_the_month + datetime.timedelta(40) in_the_next_month = first_of_the_month + datetime.timedelta(40)
last_of_the_month = datetime.datetime(in_the_next_month.year, last_of_the_month = datetime.datetime(in_the_next_month.year,
in_the_next_month.month, 1)\ in_the_next_month.month, 1)\
- datetime.timedelta(1) - datetime.timedelta(1)
periods.append((first_of_the_month.strftime(FORMAT_MONTH), 0, periods.append((first_of_the_month.strftime(FORMAT_MONTH), 0,
first_of_the_month, last_of_the_month)) first_of_the_month, last_of_the_month))
else: else:
# first_of_the_month has somehow ended up in the future # first_of_the_month has somehow ended up in the future
break break
month += 1 month += 1
if month > 12: if month > 12:
year += 1 year += 1
month = 1 month = 1
else: else:
raise NotImplementedError raise NotImplementedError
self.download_and_store(periods) self.download_and_store(periods)
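`for_date` walks month by month from the given date up to the current month, emitting one period tuple per month; completed months span their full calendar range, while the current month only runs up to today. A standalone sketch of that month enumeration (the helper name and tuple shape here are illustrative, not the extension's):

```python
import datetime

def month_periods(start, now):
    # Enumerate (period_name, first_day, last_day) for each month
    # from `start`'s month up to and including `now`'s month.
    periods = []
    year, month = start.year, start.month
    while (year, month) <= (now.year, now.month):
        first = datetime.datetime(year, month, 1)
        # Jumping 40 days forward always lands in the next month;
        # the first of that month minus one day is this month's last day.
        in_next = first + datetime.timedelta(40)
        last = datetime.datetime(in_next.year, in_next.month, 1) - datetime.timedelta(1)
        if (year, month) == (now.year, now.month):
            last = now  # the current month is only complete up to today
        periods.append((first.strftime('%Y-%m'), first, last))
        month += 1
        if month > 12:
            year, month = year + 1, 1
    return periods

ps = month_periods(datetime.datetime(2012, 11, 5),
                   datetime.datetime(2013, 1, 15))
# three periods: 2012-11, 2012-12, 2013-01
```

The 40-day jump is the same trick the original loop uses to find the last day of an arbitrary month without special-casing month lengths.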
   
@staticmethod @staticmethod
def get_full_period_name(period_name, period_complete_day): def get_full_period_name(period_name, period_complete_day):
if period_complete_day: if period_complete_day:
return period_name + ' (up to %ith)' % period_complete_day return period_name + ' (up to %ith)' % period_complete_day
else: else:
return period_name return period_name
   
   
def download_and_store(self, periods): def download_and_store(self, periods):
for period_name, period_complete_day, start_date, end_date in periods: for period_name, period_complete_day, start_date, end_date in periods:
  log.info('Period "%s" (%s - %s)',
  self.get_full_period_name(period_name, period_complete_day),
  start_date.strftime('%Y-%m-%d'),
  end_date.strftime('%Y-%m-%d'))
   
if self.delete_first: if self.delete_first:
log.info('Deleting existing Analytics for period "%s"', log.info('Deleting existing Analytics for this period "%s"',
period_name) period_name)
ga_model.delete(period_name) ga_model.delete(period_name)
log.info('Downloading Analytics for period "%s" (%s - %s)',  
self.get_full_period_name(period_name, period_complete_day), if not self.skip_url_stats:
start_date.strftime('%Y %m %d'), # Clean out old url data before storing the new
end_date.strftime('%Y %m %d')) ga_model.pre_update_url_stats(period_name)
   
data = self.download(start_date, end_date, '~/dataset/[a-z0-9-_]+') accountName = config.get('googleanalytics.account')
log.info('Storing Dataset Analytics for period "%s"',  
self.get_full_period_name(period_name, period_complete_day)) log.info('Downloading analytics for dataset views')
self.store(period_name, period_complete_day, data, ) data = self.download(start_date, end_date, '~/%s/dataset/[a-z0-9-_]+' % accountName)
   
data = self.download(start_date, end_date, '~/publisher/[a-z0-9-_]+') log.info('Storing dataset views (%i rows)', len(data.get('url')))
log.info('Storing Publisher Analytics for period "%s"', self.store(period_name, period_complete_day, data, )
self.get_full_period_name(period_name, period_complete_day))  
self.store(period_name, period_complete_day, data,) log.info('Downloading analytics for publisher views')
  data = self.download(start_date, end_date, '~/%s/publisher/[a-z0-9-_]+' % accountName)
ga_model.update_publisher_stats(period_name) # about 30 seconds.  
self.sitewide_stats( period_name ) log.info('Storing publisher views (%i rows)', len(data.get('url')))
  self.store(period_name, period_complete_day, data,)
   
  # Make sure the All records are correct.
  ga_model.post_update_url_stats()
   
  log.info('Associating datasets with their publisher')
  ga_model.update_publisher_stats(period_name) # about 30 seconds.
   
   
  log.info('Downloading and storing analytics for site-wide stats')
  self.sitewide_stats( period_name, period_complete_day )
   
  log.info('Downloading and storing analytics for social networks')
self.update_social_info(period_name, start_date, end_date) self.update_social_info(period_name, start_date, end_date)
   
   
def update_social_info(self, period_name, start_date, end_date): def update_social_info(self, period_name, start_date, end_date):
start_date = start_date.strftime('%Y-%m-%d') start_date = start_date.strftime('%Y-%m-%d')
end_date = end_date.strftime('%Y-%m-%d') end_date = end_date.strftime('%Y-%m-%d')
query = 'ga:hasSocialSourceReferral=~Yes$' query = 'ga:hasSocialSourceReferral=~Yes$'
metrics = 'ga:entrances' metrics = 'ga:entrances'
sort = '-ga:entrances' sort = '-ga:entrances'
   
# Supported query params at # Supported query params at
# https://developers.google.com/analytics/devguides/reporting/core/v3/reference # https://developers.google.com/analytics/devguides/reporting/core/v3/reference
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
filters=query, filters=query,
start_date=start_date, start_date=start_date,
metrics=metrics, metrics=metrics,
sort=sort, sort=sort,
dimensions="ga:landingPagePath,ga:socialNetwork", dimensions="ga:landingPagePath,ga:socialNetwork",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
data = collections.defaultdict(list) data = collections.defaultdict(list)
rows = results.get('rows',[]) rows = results.get('rows',[])
for row in rows: for row in rows:
from ga_model import _normalize_url url = _normalize_url('http:/' + row[0])
data[_normalize_url(row[0])].append( (row[1], int(row[2]),) ) data[url].append( (row[1], int(row[2]),) )
ga_model.update_social(period_name, data) ga_model.update_social(period_name, data)
   
   
def download(self, start_date, end_date, path='~/dataset/[a-z0-9-_]+'): def download(self, start_date, end_date, path=None):
'''Get data from GA for a given time period''' '''Get data from GA for a given time period'''
start_date = start_date.strftime('%Y-%m-%d') start_date = start_date.strftime('%Y-%m-%d')
end_date = end_date.strftime('%Y-%m-%d') end_date = end_date.strftime('%Y-%m-%d')
query = 'ga:pagePath=%s$' % path query = 'ga:pagePath=%s$' % path
metrics = 'ga:uniquePageviews, ga:visits' metrics = 'ga:pageviews, ga:visits'
sort = '-ga:uniquePageviews' sort = '-ga:pageviews'
   
# Supported query params at # Supported query params at
# https://developers.google.com/analytics/devguides/reporting/core/v3/reference # https://developers.google.com/analytics/devguides/reporting/core/v3/reference
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
filters=query, filters=query,
start_date=start_date, start_date=start_date,
metrics=metrics, metrics=metrics,
sort=sort, sort=sort,
dimensions="ga:pagePath", dimensions="ga:pagePath",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
   
packages = [] packages = []
  log.info("There are %d results" % results['totalResults'])
for entry in results.get('rows'): for entry in results.get('rows'):
(loc,pageviews,visits) = entry (loc,pageviews,visits) = entry
packages.append( ('http:/' + loc, pageviews, visits,) ) # Temporary hack url = _normalize_url('http:/' + loc) # strips off domain e.g. www.data.gov.uk or data.gov.uk
   
  if not url.startswith('/dataset/') and not url.startswith('/publisher/'):
  # filter out strays like:
  # /data/user/login?came_from=http://data.gov.uk/dataset/os-code-point-open
  # /403.html?page=/about&from=http://data.gov.uk/publisher/planning-inspectorate
  continue
  packages.append( (url, pageviews, visits,) ) # Temporary hack
return dict(url=packages) return dict(url=packages)
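`download` normalizes each GA pagePath and keeps only URLs under `/dataset/` or `/publisher/`, discarding strays such as login redirects that merely mention a dataset URL in a query string. A minimal sketch of that filter, with a stand-in for `ga_model._normalize_url` (assumed here to strip the scheme and domain):

```python
def normalize_url(url):
    # Stand-in for ga_model._normalize_url: keep only the path,
    # e.g. 'http://data.gov.uk/dataset/foo' -> '/dataset/foo'
    return '/' + '/'.join(url.split('/')[3:])

def keep_row(loc):
    # GA reports pagePath without a scheme; prefix one so the
    # normalizer can strip the domain consistently.
    url = normalize_url('http://' + loc)
    return url.startswith('/dataset/') or url.startswith('/publisher/')

rows = ['data.gov.uk/dataset/os-code-point-open',
        'data.gov.uk/data/user/login?came_from=x']
kept = [r for r in rows if keep_row(r)]
# only the /dataset/ row survives
```

Filtering by path prefix here keeps rows like `/data/user/login?came_from=...` out of the per-dataset view counts.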
   
def store(self, period_name, period_complete_day, data): def store(self, period_name, period_complete_day, data):
if 'url' in data: if 'url' in data:
ga_model.update_url_stats(period_name, period_complete_day, data['url']) ga_model.update_url_stats(period_name, period_complete_day, data['url'])
   
def sitewide_stats(self, period_name): def sitewide_stats(self, period_name, period_complete_day):
import calendar import calendar
year, month = period_name.split('-') year, month = period_name.split('-')
_, last_day_of_month = calendar.monthrange(int(year), int(month)) _, last_day_of_month = calendar.monthrange(int(year), int(month))
   
start_date = '%s-01' % period_name start_date = '%s-01' % period_name
end_date = '%s-%s' % (period_name, last_day_of_month) end_date = '%s-%s' % (period_name, last_day_of_month)
print 'Sitewide_stats for %s (%s -> %s)' % (period_name, start_date, end_date)  
   
funcs = ['_totals_stats', '_social_stats', '_os_stats', funcs = ['_totals_stats', '_social_stats', '_os_stats',
'_locale_stats', '_browser_stats', '_mobile_stats'] '_locale_stats', '_browser_stats', '_mobile_stats', '_download_stats']
for f in funcs: for f in funcs:
print ' + Fetching %s stats' % f.split('_')[1] log.info('Downloading analytics for %s' % f.split('_')[1])
getattr(self, f)(start_date, end_date, period_name) getattr(self, f)(start_date, end_date, period_name, period_complete_day)
   
def _get_results(result_data, f): def _get_results(result_data, f):
data = {} data = {}
for result in result_data: for result in result_data:
key = f(result) key = f(result)
data[key] = data.get(key,0) + result[1] data[key] = data.get(key,0) + result[1]
return data return data
   
def _totals_stats(self, start_date, end_date, period_name): def _totals_stats(self, start_date, end_date, period_name, period_complete_day):
""" Fetches distinct totals, total pageviews etc """ """ Fetches distinct totals, total pageviews etc """
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
ga_model.update_sitewide_stats(period_name, "Totals", {'Total page views': result_data[0][0]}) ga_model.update_sitewide_stats(period_name, "Totals", {'Total page views': result_data[0][0]},
  period_complete_day)
   
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:pageviewsPerVisit,ga:avgTimeOnSite,ga:percentNewVisits,ga:visits', metrics='ga:pageviewsPerVisit,ga:avgTimeOnSite,ga:percentNewVisits,ga:visits',
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
data = { data = {
'Pages per visit': result_data[0][0], 'Pages per visit': result_data[0][0],
'Average time on site': result_data[0][1], 'Average time on site': result_data[0][1],
'New visits': result_data[0][2], 'New visits': result_data[0][2],
'Total visits': result_data[0][3], 'Total visits': result_data[0][3],
} }
ga_model.update_sitewide_stats(period_name, "Totals", data) ga_model.update_sitewide_stats(period_name, "Totals", data, period_complete_day)
   
# Bounces from /data. This url is specified in configuration because # Bounces from / or another configurable page.
# for DGU we don't want /. path = '/%s%s' % (config.get('googleanalytics.account'),
path = config.get('ga-report.bounce_url','/') config.get('ga-report.bounce_url', '/'))
print path results = self.service.data().ga().get(
results = self.service.data().ga().get( ids='ga:' + self.profile_id,
ids='ga:' + self.profile_id, filters='ga:pagePath==%s' % (path,),
filters='ga:pagePath=~%s$' % (path,), start_date=start_date,
start_date=start_date, metrics='ga:visitBounceRate',
metrics='ga:bounces,ga:uniquePageviews',  
dimensions='ga:pagePath', dimensions='ga:pagePath',
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
for results in result_data: if not result_data or len(result_data) != 1:
if results[0] == path: log.error('Could not pinpoint the bounces for path: %s. Got results: %r',
bounce, total = [float(x) for x in results[1:]] path, result_data)
pct = 100 * bounce/total return
print "%d bounces from %d total == %s" % (bounce, total, pct) results = result_data[0]
ga_model.update_sitewide_stats(period_name, "Totals", {'Bounces': pct}) bounces = float(results[1])
  # visitBounceRate is already a %
  log.info('Google reports visitBounceRate as %s', bounces)
def _locale_stats(self, start_date, end_date, period_name): ga_model.update_sitewide_stats(period_name, "Totals", {'Bounce rate (home page)': float(bounces)},
  period_complete_day)
   
   
  def _locale_stats(self, start_date, end_date, period_name, period_complete_day):
""" Fetches stats about language and country """ """ Fetches stats about language and country """
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
dimensions="ga:language,ga:country", dimensions="ga:language,ga:country",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
data = {} data = {}
for result in result_data: for result in result_data:
data[result[0]] = data.get(result[0], 0) + int(result[2]) data[result[0]] = data.get(result[0], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Languages", data) ga_model.update_sitewide_stats(period_name, "Languages", data, period_complete_day)
   
data = {} data = {}
for result in result_data: for result in result_data:
data[result[1]] = data.get(result[1], 0) + int(result[2]) data[result[1]] = data.get(result[1], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Country", data) ga_model.update_sitewide_stats(period_name, "Country", data, period_complete_day)
   
   
def _social_stats(self, start_date, end_date, period_name): def _download_stats(self, start_date, end_date, period_name, period_complete_day):
  """ Fetches stats about data downloads """
  import ckan.model as model
   
  data = {}
   
  results = self.service.data().ga().get(
  ids='ga:' + self.profile_id,
  start_date=start_date,
  filters='ga:eventAction==download',
  metrics='ga:totalEvents',
  sort='-ga:totalEvents',
  dimensions="ga:eventLabel",
  max_results=10000,
  end_date=end_date).execute()
  result_data = results.get('rows')
  if not result_data:
  # We may not have data for this time period, so we need to bail
  # early.
  log.info("There is no download data for this time period")
  return
   
  def process_result_data(result_data, cached=False):
  progress_total = len(result_data)
  progress_count = 0
  resources_not_matched = []
  for result in result_data:
  progress_count += 1
  if progress_count % 100 == 0:
  log.debug('.. %d/%d done so far', progress_count, progress_total)
   
  url = result[0].strip()
   
  # Get package id associated with the resource that has this URL.
  q = model.Session.query(model.Resource)
  if cached:
  r = q.filter(model.Resource.cache_url.like("%s%%" % url)).first()
  else:
  r = q.filter(model.Resource.url.like("%s%%" % url)).first()
   
  package_name = r.resource_group.package.name if r else ""
  if package_name:
  data[package_name] = data.get(package_name, 0) + int(result[1])
  else:
  resources_not_matched.append(url)
  continue
  if resources_not_matched:
log.debug('Could not match %i of %i resource URLs to datasets. e.g. %r',
  len(resources_not_matched), progress_total, resources_not_matched[:3])
   
  log.info('Associating downloads of resource URLs with their respective datasets')
  process_result_data(results.get('rows'))
   
  results = self.service.data().ga().get(
  ids='ga:' + self.profile_id,
  start_date=start_date,
  filters='ga:eventAction==download-cache',
  metrics='ga:totalEvents',
  sort='-ga:totalEvents',
  dimensions="ga:eventLabel",
  max_results=10000,
  end_date=end_date).execute()
  log.info('Associating downloads of cache resource URLs with their respective datasets')
process_result_data(results.get('rows'), cached=True)
   
  self._filter_out_long_tail(data, MIN_DOWNLOADS)
  ga_model.update_sitewide_stats(period_name, "Downloads", data, period_complete_day)
   
  def _social_stats(self, start_date, end_date, period_name, period_complete_day):
""" Finds out which social sites people are referred from """ """ Finds out which social sites people are referred from """
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
dimensions="ga:socialNetwork,ga:referralPath", dimensions="ga:socialNetwork,ga:referralPath",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
data = {} data = {}
for result in result_data: for result in result_data:
if not result[0] == '(not set)': if not result[0] == '(not set)':
data[result[0]] = data.get(result[0], 0) + int(result[2]) data[result[0]] = data.get(result[0], 0) + int(result[2])
self._filter_out_long_tail(data, 3) self._filter_out_long_tail(data, 3)
ga_model.update_sitewide_stats(period_name, "Social sources", data) ga_model.update_sitewide_stats(period_name, "Social sources", data, period_complete_day)
   
   
def _os_stats(self, start_date, end_date, period_name): def _os_stats(self, start_date, end_date, period_name, period_complete_day):
""" Operating system stats """ """ Operating system stats """
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
dimensions="ga:operatingSystem,ga:operatingSystemVersion", dimensions="ga:operatingSystem,ga:operatingSystemVersion",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
data = {} data = {}
for result in result_data: for result in result_data:
data[result[0]] = data.get(result[0], 0) + int(result[2]) data[result[0]] = data.get(result[0], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Operating Systems", data) ga_model.update_sitewide_stats(period_name, "Operating Systems", data, period_complete_day)
   
data = {} data = {}
for result in result_data: for result in result_data:
if int(result[2]) >= MIN_VIEWS: if int(result[2]) >= MIN_VIEWS:
key = "%s %s" % (result[0],result[1]) key = "%s %s" % (result[0],result[1])
data[key] = result[2] data[key] = result[2]
ga_model.update_sitewide_stats(period_name, "Operating Systems versions", data) ga_model.update_sitewide_stats(period_name, "Operating Systems versions", data, period_complete_day)
   
   
def _browser_stats(self, start_date, end_date, period_name): def _browser_stats(self, start_date, end_date, period_name, period_complete_day):
""" Information about browsers and browser versions """ """ Information about browsers and browser versions """
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
dimensions="ga:browser,ga:browserVersion", dimensions="ga:browser,ga:browserVersion",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
result_data = results.get('rows') result_data = results.get('rows')
# e.g. [u'Firefox', u'19.0', u'20'] # e.g. [u'Firefox', u'19.0', u'20']
   
data = {} data = {}
for result in result_data: for result in result_data:
data[result[0]] = data.get(result[0], 0) + int(result[2]) data[result[0]] = data.get(result[0], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Browsers", data) ga_model.update_sitewide_stats(period_name, "Browsers", data, period_complete_day)
   
data = {} data = {}
for result in result_data: for result in result_data:
key = "%s %s" % (result[0], self._filter_browser_version(result[0], result[1])) key = "%s %s" % (result[0], self._filter_browser_version(result[0], result[1]))
data[key] = data.get(key, 0) + int(result[2]) data[key] = data.get(key, 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Browser versions", data) ga_model.update_sitewide_stats(period_name, "Browser versions", data, period_complete_day)
   
@classmethod @classmethod
def _filter_browser_version(cls, browser, version_str): def _filter_browser_version(cls, browser, version_str):
''' '''
Simplifies a browser version string if it is detailed. Simplifies a browser version string if it is detailed.
i.e. groups together Firefox 3.5.1 and 3.5.2 to be just 3. i.e. groups together Firefox 3.5.1 and 3.5.2 to be just 3.
This is helpful when viewing stats and good to protect privacy. This is helpful when viewing stats and good to protect privacy.
''' '''
ver = version_str ver = version_str
parts = ver.split('.') parts = ver.split('.')
if len(parts) > 1: if len(parts) > 1:
ver = parts[0] ver = parts[0]
# Special case complex version nums # Special case complex version nums
if browser in ['Safari', 'Android Browser']: if browser in ['Safari', 'Android Browser']:
ver = parts[0] ver = parts[0]
if len(ver) > 2: if len(ver) > 2:
num_hidden_digits = len(ver) - 2 num_hidden_digits = len(ver) - 2
ver = ver[0] + ver[1] + 'X' * num_hidden_digits ver = ver[0] + ver[1] + 'X' * num_hidden_digits
return ver return ver
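The grouping rules above can be hard to follow from the branches alone; here is a standalone sketch of the same logic (hypothetical helper name, no CKAN dependencies):

```python
def filter_browser_version(browser, version_str):
    """Collapse a detailed version string to a coarse bucket.

    Mirrors _filter_browser_version above: keep only the major
    version, and for browsers with large build numbers (Safari,
    Android Browser) mask all but the first two digits with 'X'.
    """
    parts = version_str.split('.')
    ver = parts[0] if len(parts) > 1 else version_str
    if browser in ('Safari', 'Android Browser'):
        ver = parts[0]
        if len(ver) > 2:
            ver = ver[:2] + 'X' * (len(ver) - 2)
    return ver

print(filter_browser_version('Firefox', '3.5.1'))   # -> 3
print(filter_browser_version('Safari', '534.30'))   # -> 53X
```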
   
def _mobile_stats(self, start_date, end_date, period_name): def _mobile_stats(self, start_date, end_date, period_name, period_complete_day):
""" Info about mobile devices """ """ Info about mobile devices """
   
results = self.service.data().ga().get( results = self.service.data().ga().get(
ids='ga:' + self.profile_id, ids='ga:' + self.profile_id,
start_date=start_date, start_date=start_date,
metrics='ga:uniquePageviews', metrics='ga:pageviews',
sort='-ga:uniquePageviews', sort='-ga:pageviews',
dimensions="ga:mobileDeviceBranding, ga:mobileDeviceInfo", dimensions="ga:mobileDeviceBranding, ga:mobileDeviceInfo",
max_results=10000, max_results=10000,
end_date=end_date).execute() end_date=end_date).execute()
   
result_data = results.get('rows') result_data = results.get('rows')
data = {} data = {}
for result in result_data: for result in result_data:
data[result[0]] = data.get(result[0], 0) + int(result[2]) data[result[0]] = data.get(result[0], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Mobile brands", data) ga_model.update_sitewide_stats(period_name, "Mobile brands", data, period_complete_day)
   
data = {} data = {}
for result in result_data: for result in result_data:
data[result[1]] = data.get(result[1], 0) + int(result[2]) data[result[1]] = data.get(result[1], 0) + int(result[2])
self._filter_out_long_tail(data, MIN_VIEWS) self._filter_out_long_tail(data, MIN_VIEWS)
ga_model.update_sitewide_stats(period_name, "Mobile devices", data) ga_model.update_sitewide_stats(period_name, "Mobile devices", data, period_complete_day)
   
@classmethod @classmethod
def _filter_out_long_tail(cls, data, threshold=10): def _filter_out_long_tail(cls, data, threshold=10):
''' '''
Given data which is a frequency distribution, filter out Given data which is a frequency distribution, filter out
results which are below a threshold count. This is good to protect results which are below a threshold count. This is good to protect
privacy. privacy.
''' '''
for key, value in data.items(): for key, value in data.items():
if value < threshold: if value < threshold:
del data[key] del data[key]
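`_filter_out_long_tail` mutates the dict while iterating, which works on Python 2 (where `items()` returns a list) but raises a RuntimeError on Python 3. A Python 3-safe sketch of the same idea:

```python
def filter_out_long_tail(data, threshold=10):
    """Drop entries whose count is below the threshold.

    Iterating over a snapshot (list(...)) makes it safe to delete
    keys from the dict as we go.
    """
    for key, value in list(data.items()):
        if value < threshold:
            del data[key]

counts = {'Firefox': 120, 'Lynx': 2, 'Chrome': 45}
filter_out_long_tail(counts, threshold=10)
print(counts)  # -> {'Firefox': 120, 'Chrome': 45}
```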
   
import re import re
import uuid import uuid
   
from sqlalchemy import Table, Column, MetaData, ForeignKey from sqlalchemy import Table, Column, MetaData, ForeignKey
from sqlalchemy import types from sqlalchemy import types
from sqlalchemy.sql import select from sqlalchemy.sql import select
from sqlalchemy.orm import mapper, relation from sqlalchemy.orm import mapper, relation
from sqlalchemy import func from sqlalchemy import func
   
import ckan.model as model import ckan.model as model
from ckan.lib.base import * from ckan.lib.base import *
   
  log = __import__('logging').getLogger(__name__)
   
def make_uuid(): def make_uuid():
return unicode(uuid.uuid4()) return unicode(uuid.uuid4())
   
metadata = MetaData() metadata = MetaData()
   
class GA_Url(object): class GA_Url(object):
   
def __init__(self, **kwargs): def __init__(self, **kwargs):
for k,v in kwargs.items(): for k,v in kwargs.items():
setattr(self, k, v) setattr(self, k, v)
   
url_table = Table('ga_url', metadata, url_table = Table('ga_url', metadata,
Column('id', types.UnicodeText, primary_key=True, Column('id', types.UnicodeText, primary_key=True,
default=make_uuid), default=make_uuid),
Column('period_name', types.UnicodeText), Column('period_name', types.UnicodeText),
Column('period_complete_day', types.Integer), Column('period_complete_day', types.Integer),
Column('pageviews', types.UnicodeText), Column('pageviews', types.UnicodeText),
Column('visitors', types.UnicodeText), Column('visits', types.UnicodeText),
Column('url', types.UnicodeText), Column('url', types.UnicodeText),
Column('department_id', types.UnicodeText), Column('department_id', types.UnicodeText),
Column('package_id', types.UnicodeText), Column('package_id', types.UnicodeText),
) )
mapper(GA_Url, url_table) mapper(GA_Url, url_table)
   
   
class GA_Stat(object): class GA_Stat(object):
   
def __init__(self, **kwargs): def __init__(self, **kwargs):
for k,v in kwargs.items(): for k,v in kwargs.items():
setattr(self, k, v) setattr(self, k, v)
   
stat_table = Table('ga_stat', metadata, stat_table = Table('ga_stat', metadata,
Column('id', types.UnicodeText, primary_key=True, Column('id', types.UnicodeText, primary_key=True,
default=make_uuid), default=make_uuid),
Column('period_name', types.UnicodeText), Column('period_name', types.UnicodeText),
  Column('period_complete_day', types.UnicodeText),
Column('stat_name', types.UnicodeText), Column('stat_name', types.UnicodeText),
Column('key', types.UnicodeText), Column('key', types.UnicodeText),
Column('value', types.UnicodeText), ) Column('value', types.UnicodeText), )
mapper(GA_Stat, stat_table) mapper(GA_Stat, stat_table)
   
   
class GA_Publisher(object): class GA_Publisher(object):
   
def __init__(self, **kwargs): def __init__(self, **kwargs):
for k,v in kwargs.items(): for k,v in kwargs.items():
setattr(self, k, v) setattr(self, k, v)
   
pub_table = Table('ga_publisher', metadata, pub_table = Table('ga_publisher', metadata,
Column('id', types.UnicodeText, primary_key=True, Column('id', types.UnicodeText, primary_key=True,
default=make_uuid), default=make_uuid),
Column('period_name', types.UnicodeText), Column('period_name', types.UnicodeText),
Column('publisher_name', types.UnicodeText), Column('publisher_name', types.UnicodeText),
Column('views', types.UnicodeText), Column('views', types.UnicodeText),
Column('visitors', types.UnicodeText), Column('visits', types.UnicodeText),
Column('toplevel', types.Boolean, default=False), Column('toplevel', types.Boolean, default=False),
Column('subpublishercount', types.Integer, default=0), Column('subpublishercount', types.Integer, default=0),
Column('parent', types.UnicodeText), Column('parent', types.UnicodeText),
) )
mapper(GA_Publisher, pub_table) mapper(GA_Publisher, pub_table)
   
   
class GA_ReferralStat(object): class GA_ReferralStat(object):
   
def __init__(self, **kwargs): def __init__(self, **kwargs):
for k,v in kwargs.items(): for k,v in kwargs.items():
setattr(self, k, v) setattr(self, k, v)
   
referrer_table = Table('ga_referrer', metadata, referrer_table = Table('ga_referrer', metadata,
Column('id', types.UnicodeText, primary_key=True, Column('id', types.UnicodeText, primary_key=True,
default=make_uuid), default=make_uuid),
Column('period_name', types.UnicodeText), Column('period_name', types.UnicodeText),
Column('source', types.UnicodeText), Column('source', types.UnicodeText),
Column('url', types.UnicodeText), Column('url', types.UnicodeText),
Column('count', types.Integer), Column('count', types.Integer),
) )
mapper(GA_ReferralStat, referrer_table) mapper(GA_ReferralStat, referrer_table)
   
   
   
def init_tables(): def init_tables():
metadata.create_all(model.meta.engine) metadata.create_all(model.meta.engine)
   
   
cached_tables = {} cached_tables = {}
   
   
def get_table(name): def get_table(name):
if name not in cached_tables: if name not in cached_tables:
meta = MetaData() meta = MetaData()
meta.reflect(bind=model.meta.engine) meta.reflect(bind=model.meta.engine)
table = meta.tables[name] table = meta.tables[name]
cached_tables[name] = table cached_tables[name] = table
return cached_tables[name] return cached_tables[name]
   
   
def _normalize_url(url): def _normalize_url(url):
'''Strip off the hostname etc. Do this before storing it. '''Strip off the hostname etc. Do this before storing it.
   
>>> _normalize_url('http://data.gov.uk/dataset/weekly_fuel_prices') >>> _normalize_url('http://data.gov.uk/dataset/weekly_fuel_prices')
'/dataset/weekly_fuel_prices' '/dataset/weekly_fuel_prices'
''' '''
# Deliberately leaving a / return '/' + '/'.join(url.split('/')[3:])
url = url.replace('http:/','')  
return '/' + '/'.join(url.split('/')[2:])  
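The rewritten `_normalize_url` relies on the fact that for an absolute URL, `split('/')` puts the scheme at index 0, an empty string at index 1 and the host at index 2, so everything from index 3 onward is the path:

```python
def normalize_url(url):
    # 'http://data.gov.uk/dataset/x'.split('/') ->
    # ['http:', '', 'data.gov.uk', 'dataset', 'x']
    return '/' + '/'.join(url.split('/')[3:])

print(normalize_url('http://data.gov.uk/dataset/weekly_fuel_prices'))
# -> /dataset/weekly_fuel_prices
```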
  def _get_package_and_publisher(url):
   
def _get_department_id_of_url(url):  
# e.g. /dataset/fuel_prices # e.g. /dataset/fuel_prices
# e.g. /dataset/fuel_prices/resource/e63380d4 # e.g. /dataset/fuel_prices/resource/e63380d4
dataset_match = re.match('/dataset/([^/]+)(/.*)?', url) dataset_match = re.match('/dataset/([^/]+)(/.*)?', url)
if dataset_match: if dataset_match:
dataset_ref = dataset_match.groups()[0] dataset_ref = dataset_match.groups()[0]
dataset = model.Package.get(dataset_ref) dataset = model.Package.get(dataset_ref)
if dataset: if dataset:
publisher_groups = dataset.get_groups('publisher') publisher_groups = dataset.get_groups('publisher')
if publisher_groups: if publisher_groups:
return publisher_groups[0].name return dataset_ref,publisher_groups[0].name
  return dataset_ref, None
else: else:
publisher_match = re.match('/publisher/([^/]+)(/.*)?', url) publisher_match = re.match('/publisher/([^/]+)(/.*)?', url)
if publisher_match: if publisher_match:
return publisher_match.groups()[0] return None, publisher_match.groups()[0]
  return None, None
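A sketch of the URL-classification half of `_get_package_and_publisher` (the real function additionally looks the dataset up in the CKAN model to resolve its publisher group; `classify_url` is a hypothetical name):

```python
import re

def classify_url(url):
    """Pull a dataset or publisher name out of a normalized path.

    Returns (dataset_ref, publisher_name); either may be None.
    """
    dataset_match = re.match(r'/dataset/([^/]+)(/.*)?', url)
    if dataset_match:
        return dataset_match.group(1), None
    publisher_match = re.match(r'/publisher/([^/]+)(/.*)?', url)
    if publisher_match:
        return None, publisher_match.group(1)
    return None, None

print(classify_url('/dataset/fuel_prices/resource/e63380d4'))
# -> ('fuel_prices', None)
```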
   
def update_sitewide_stats(period_name, stat_name, data): def update_sitewide_stats(period_name, stat_name, data, period_complete_day):
for k,v in data.iteritems(): for k,v in data.iteritems():
item = model.Session.query(GA_Stat).\ item = model.Session.query(GA_Stat).\
filter(GA_Stat.period_name==period_name).\ filter(GA_Stat.period_name==period_name).\
filter(GA_Stat.key==k).\ filter(GA_Stat.key==k).\
filter(GA_Stat.stat_name==stat_name).first() filter(GA_Stat.stat_name==stat_name).first()
if item: if item:
item.period_name = period_name item.period_name = period_name
item.key = k item.key = k
item.value = v item.value = v
  item.period_complete_day = period_complete_day
model.Session.add(item) model.Session.add(item)
else: else:
# create the row # create the row
values = {'id': make_uuid(), values = {'id': make_uuid(),
'period_name': period_name, 'period_name': period_name,
  'period_complete_day': period_complete_day,
'key': k, 'key': k,
'value': v, 'value': v,
'stat_name': stat_name 'stat_name': stat_name
} }
model.Session.add(GA_Stat(**values)) model.Session.add(GA_Stat(**values))
model.Session.commit() model.Session.commit()
   
   
  def pre_update_url_stats(period_name):
  q = model.Session.query(GA_Url).\
  filter(GA_Url.period_name==period_name)
  log.debug("Deleting %d '%s' records" % (q.count(), period_name))
  q.delete()
   
  q = model.Session.query(GA_Url).\
  filter(GA_Url.period_name == 'All')
  log.debug("Deleting %d 'All' records..." % q.count())
  q.delete()
   
  model.Session.flush()
  model.Session.commit()
  model.repo.commit_and_remove()
  log.debug('...done')
   
  def post_update_url_stats():
   
  """ Check the distinct url field in ga_url and make sure
  it has an All record. If not then create one.
   
  After running this then every URL should have an All
  record regardless of whether the URL has an entry for
  the month being currently processed.
  """
  log.debug('Post-processing "All" records...')
  query = """select url, pageviews::int, visits::int
  from ga_url
  where url not in (select url from ga_url where period_name ='All')"""
  connection = model.Session.connection()
  res = connection.execute(query)
   
  views, visits = {}, {}
  # url, views, visits
  for row in res:
  views[row[0]] = views.get(row[0], 0) + row[1]
  visits[row[0]] = visits.get(row[0], 0) + row[2]
   
  progress_total = len(views.keys())
  progress_count = 0
  for key in views.keys():
  progress_count += 1
  if progress_count % 100 == 0:
  log.debug('.. %d/%d done so far', progress_count, progress_total)
   
  package, publisher = _get_package_and_publisher(key)
   
  values = {'id': make_uuid(),
  'period_name': "All",
  'period_complete_day': 0,
  'url': key,
  'pageviews': views[key],
  'visits': visits[key],
  'department_id': publisher,
  'package_id': package
  }
  model.Session.add(GA_Url(**values))
  model.Session.commit()
  log.debug('..done')
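`post_update_url_stats` folds the per-month rows into one total per URL before writing the 'All' records; the accumulation step on its own, with synthetic rows:

```python
rows = [
    # (url, pageviews, visits) -- one row per URL per month
    ('/dataset/a', 10, 7),
    ('/dataset/a', 5, 3),
    ('/dataset/b', 2, 1),
]

views, visits = {}, {}
for url, pageviews, row_visits in rows:
    views[url] = views.get(url, 0) + pageviews
    visits[url] = visits.get(url, 0) + row_visits

print(views)   # -> {'/dataset/a': 15, '/dataset/b': 2}
print(visits)  # -> {'/dataset/a': 10, '/dataset/b': 1}
```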
   
   
def update_url_stats(period_name, period_complete_day, url_data): def update_url_stats(period_name, period_complete_day, url_data):
for url, views, visitors in url_data: '''
url = _normalize_url(url) Given a list of urls and number of hits for each during a given period,
department_id = _get_department_id_of_url(url) stores them in GA_Url under the period and recalculates the totals for
  the 'All' period.
package = None '''
if url.startswith('/dataset/'): progress_total = len(url_data)
package = url[len('/dataset/'):] progress_count = 0
  for url, views, visits in url_data:
# see if the row for this url & month is in the table already progress_count += 1
  if progress_count % 100 == 0:
  log.debug('.. %d/%d done so far', progress_count, progress_total)
   
  package, publisher = _get_package_and_publisher(url)
   
item = model.Session.query(GA_Url).\ item = model.Session.query(GA_Url).\
filter(GA_Url.period_name==period_name).\ filter(GA_Url.period_name==period_name).\
filter(GA_Url.url==url).first() filter(GA_Url.url==url).first()
if item: if item:
item.period_name = period_name item.pageviews = int(item.pageviews) + int(views)
item.pageviews = views item.visits = int(item.visits or 0) + int(visits)
item.visitors = visitors if not item.package_id:
item.department_id = department_id item.package_id = package
item.package_id = package if not item.department_id:
  item.department_id = publisher
model.Session.add(item) model.Session.add(item)
else: else:
# create the row  
values = {'id': make_uuid(), values = {'id': make_uuid(),
'period_name': period_name, 'period_name': period_name,
'period_complete_day': period_complete_day, 'period_complete_day': period_complete_day,
'url': url, 'url': url,
'pageviews': views, 'pageviews': views,
'visitors': visitors, 'visits': visits,
'department_id': department_id, 'department_id': publisher,
'package_id': package 'package_id': package
} }
model.Session.add(GA_Url(**values)) model.Session.add(GA_Url(**values))
   
# We now need to recaculate the ALL time_period from the data we have  
# Delete the old 'All'  
old = model.Session.query(GA_Url).\  
filter(GA_Url.period_name == "All").\  
filter(GA_Url.url==url).delete()  
   
items = model.Session.query(GA_Url).\  
filter(GA_Url.period_name != "All").\  
filter(GA_Url.url==url).all()  
values = {'id': make_uuid(),  
'period_name': "All",  
'period_complete_day': "0",  
'url': url,  
'pageviews': sum([int(x.pageviews) for x in items]),  
'visitors': sum([int(x.visitors) for x in items]),  
'department_id': department_id,  
'package_id': package  
}  
model.Session.add(GA_Url(**values))  
   
model.Session.commit() model.Session.commit()
   
  if package:
  old_pageviews, old_visits = 0, 0
  old = model.Session.query(GA_Url).\
  filter(GA_Url.period_name=='All').\
  filter(GA_Url.url==url).all()
  old_pageviews = sum([int(o.pageviews) for o in old])
  old_visits = sum([int(o.visits) for o in old])
   
  entries = model.Session.query(GA_Url).\
  filter(GA_Url.period_name!='All').\
  filter(GA_Url.url==url).all()
  values = {'id': make_uuid(),
  'period_name': 'All',
  'period_complete_day': 0,
  'url': url,
  'pageviews': sum([int(e.pageviews) for e in entries]) + int(old_pageviews),
  'visits': sum([int(e.visits or 0) for e in entries]) + int(old_visits),
  'department_id': publisher,
  'package_id': package
  }
   
  model.Session.add(GA_Url(**values))
  model.Session.commit()
   
   
   
   
def update_social(period_name, data): def update_social(period_name, data):
# Clean up first. # Clean up first.
model.Session.query(GA_ReferralStat).\ model.Session.query(GA_ReferralStat).\
filter(GA_ReferralStat.period_name==period_name).delete() filter(GA_ReferralStat.period_name==period_name).delete()
   
for url,data in data.iteritems(): for url,data in data.iteritems():
for entry in data: for entry in data:
source = entry[0] source = entry[0]
count = entry[1] count = entry[1]
   
item = model.Session.query(GA_ReferralStat).\ item = model.Session.query(GA_ReferralStat).\
filter(GA_ReferralStat.period_name==period_name).\ filter(GA_ReferralStat.period_name==period_name).\
filter(GA_ReferralStat.source==source).\ filter(GA_ReferralStat.source==source).\
filter(GA_ReferralStat.url==url).first() filter(GA_ReferralStat.url==url).first()
if item: if item:
item.count = item.count + count item.count = item.count + count
model.Session.add(item) model.Session.add(item)
else: else:
# create the row # create the row
values = {'id': make_uuid(), values = {'id': make_uuid(),
'period_name': period_name, 'period_name': period_name,
'source': source, 'source': source,
'url': url, 'url': url,
'count': count, 'count': count,
} }
model.Session.add(GA_ReferralStat(**values)) model.Session.add(GA_ReferralStat(**values))
model.Session.commit() model.Session.commit()
   
def update_publisher_stats(period_name): def update_publisher_stats(period_name):
""" """
Updates the publisher stats from the data retrieved for /dataset/* Updates the publisher stats from the data retrieved for /dataset/*
and /publisher/*. Will run against each dataset and generates the and /publisher/*. Will run against each dataset and generates the
totals for the entire tree beneath each publisher. totals for the entire tree beneath each publisher.
""" """
toplevel = get_top_level() toplevel = get_top_level()
publishers = model.Session.query(model.Group).\ publishers = model.Session.query(model.Group).\
filter(model.Group.type=='publisher').\ filter(model.Group.type=='publisher').\
filter(model.Group.state=='active').all() filter(model.Group.state=='active').all()
for publisher in publishers: for publisher in publishers:
views, visitors, subpub = update_publisher(period_name, publisher, publisher.name) views, visits, subpub = update_publisher(period_name, publisher, publisher.name)
parent, parents = '', publisher.get_groups('publisher') parent, parents = '', publisher.get_groups('publisher')
if parents: if parents:
parent = parents[0].name parent = parents[0].name
item = model.Session.query(GA_Publisher).\ item = model.Session.query(GA_Publisher).\
filter(GA_Publisher.period_name==period_name).\ filter(GA_Publisher.period_name==period_name).\
filter(GA_Publisher.publisher_name==publisher.name).first() filter(GA_Publisher.publisher_name==publisher.name).first()
if item: if item:
item.views = views item.views = views
item.visitors = visitors item.visits = visits
item.publisher_name = publisher.name item.publisher_name = publisher.name
item.toplevel = publisher in toplevel item.toplevel = publisher in toplevel
item.subpublishercount = subpub item.subpublishercount = subpub
item.parent = parent item.parent = parent
model.Session.add(item) model.Session.add(item)
else: else:
# create the row # create the row
values = {'id': make_uuid(), values = {'id': make_uuid(),
'period_name': period_name, 'period_name': period_name,
'publisher_name': publisher.name, 'publisher_name': publisher.name,
'views': views, 'views': views,
'visitors': visitors, 'visits': visits,
'toplevel': publisher in toplevel, 'toplevel': publisher in toplevel,
'subpublishercount': subpub, 'subpublishercount': subpub,
'parent': parent 'parent': parent
} }
model.Session.add(GA_Publisher(**values)) model.Session.add(GA_Publisher(**values))
model.Session.commit() model.Session.commit()
   
   
def update_publisher(period_name, pub, part=''): def update_publisher(period_name, pub, part=''):
views,visitors,subpub = 0, 0, 0 views,visits,subpub = 0, 0, 0
for publisher in go_down_tree(pub): for publisher in go_down_tree(pub):
subpub = subpub + 1 subpub = subpub + 1
items = model.Session.query(GA_Url).\ items = model.Session.query(GA_Url).\
filter(GA_Url.period_name==period_name).\ filter(GA_Url.period_name==period_name).\
filter(GA_Url.department_id==publisher.name).all() filter(GA_Url.department_id==publisher.name).all()
for item in items: for item in items:
views = views + int(item.pageviews) views = views + int(item.pageviews)
visitors = visitors + int(item.visitors) visits = visits + int(item.visits)
   
return views, visitors, (subpub-1) return views, visits, (subpub-1)
   
   
def get_top_level(): def get_top_level():
'''Returns the top level publishers.''' '''Returns the top level publishers.'''
return model.Session.query(model.Group).\ return model.Session.query(model.Group).\
outerjoin(model.Member, model.Member.table_id == model.Group.id and \ outerjoin(model.Member, model.Member.table_id == model.Group.id and \
model.Member.table_name == 'group' and \ model.Member.table_name == 'group' and \
model.Member.state == 'active').\ model.Member.state == 'active').\
filter(model.Member.id==None).\ filter(model.Member.id==None).\
filter(model.Group.type=='publisher').\ filter(model.Group.type=='publisher').\
order_by(model.Group.name).all() order_by(model.Group.name).all()
   
def get_children(publisher): def get_children(publisher):
'''Finds child publishers for the given publisher (object). (Not recursive)''' '''Finds child publishers for the given publisher (object). (Not recursive)'''
from ckan.model.group import HIERARCHY_CTE from ckan.model.group import HIERARCHY_CTE
return model.Session.query(model.Group).\ return model.Session.query(model.Group).\
from_statement(HIERARCHY_CTE).params(id=publisher.id, type='publisher').\ from_statement(HIERARCHY_CTE).params(id=publisher.id, type='publisher').\
all() all()
   
def go_down_tree(publisher): def go_down_tree(publisher):
'''Provided with a publisher object, it walks down the hierarchy and yields each publisher, '''Provided with a publisher object, it walks down the hierarchy and yields each publisher,
including the one you supply.''' including the one you supply.'''
yield publisher yield publisher
for child in get_children(publisher): for child in get_children(publisher):
for grandchild in go_down_tree(child): for grandchild in go_down_tree(child):
yield grandchild yield grandchild
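`go_down_tree` is a depth-first generator over the publisher hierarchy. The same pattern over a plain dict of children (hypothetical data, standing in for the CKAN group model) shows the traversal order:

```python
CHILDREN = {
    'cabinet-office': ['gds', 'ocsia'],
    'gds': ['data-gov-uk'],
}

def go_down_tree(publisher):
    """Yield the publisher itself, then every descendant, depth-first."""
    yield publisher
    for child in CHILDREN.get(publisher, []):
        for grandchild in go_down_tree(child):
            yield grandchild

print(list(go_down_tree('cabinet-office')))
# -> ['cabinet-office', 'gds', 'data-gov-uk', 'ocsia']
```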
   
def delete(period_name): def delete(period_name):
''' '''
Deletes table data for the specified period, or specify 'all' Deletes table data for the specified period, or specify 'All'
for all periods. for all periods.
''' '''
for object_type in (GA_Url, GA_Stat, GA_Publisher, GA_ReferralStat): for object_type in (GA_Url, GA_Stat, GA_Publisher, GA_ReferralStat):
q = model.Session.query(object_type) q = model.Session.query(object_type)
if period_name != 'all': if period_name != 'All':
q = q.filter_by(period_name=period_name) q = q.filter_by(period_name=period_name)
q.delete() q.delete()
model.Session.commit() model.repo.commit_and_remove()
   
  def get_score_for_dataset(dataset_name):
  '''
  Returns a "current popularity" score for a dataset,
  based on how many views it has had recently.
  '''
  import datetime
  now = datetime.datetime.now()
  last_month = now - datetime.timedelta(days=30)
  period_names = ['%s-%02d' % (last_month.year, last_month.month),
  '%s-%02d' % (now.year, now.month),
  ]
   
  score = 0
  for period_name in period_names:
  score /= 2 # previous periods are discounted by 50%
  entry = model.Session.query(GA_Url)\
  .filter(GA_Url.period_name==period_name)\
  .filter(GA_Url.package_id==dataset_name).first()
  # score
  if entry:
  views = float(entry.pageviews)
  if entry.period_complete_day:
  views_per_day = views / entry.period_complete_day
  else:
  views_per_day = views / 15 # guess
  score += views_per_day
   
  score = int(score * 100)
  log.debug('Popularity %s: %s', score, dataset_name)
  return score
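The scoring loop halves the running score before each period is added, so the previous month counts half as much as the current one. The arithmetic in isolation (synthetic numbers; `popularity_score` is a hypothetical name for the pure calculation inside `get_score_for_dataset`):

```python
def popularity_score(periods):
    """periods: oldest-first list of (pageviews, days_so_far) pairs.

    Each earlier period is discounted by 50% relative to the next,
    and views are normalised to views-per-day; 15 days is the
    fallback guess when the period's completed-day count is unknown.
    """
    score = 0
    for views, complete_day in periods:
        score /= 2  # previous periods are discounted by 50%
        per_day = views / complete_day if complete_day else views / 15
        score += per_day
    return int(score * 100)

print(popularity_score([(300, 30), (150, 15)]))  # -> 1500
```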
   
import logging import logging
import operator import operator
   
import ckan.lib.base as base import ckan.lib.base as base
import ckan.model as model import ckan.model as model
from ckan.logic import get_action from ckan.logic import get_action
   
from ckanext.ga_report.ga_model import GA_Url, GA_Publisher from ckanext.ga_report.ga_model import GA_Url, GA_Publisher
from ckanext.ga_report.controller import _get_publishers from ckanext.ga_report.controller import _get_publishers
_log = logging.getLogger(__name__) _log = logging.getLogger(__name__)
   
def popular_datasets(count=10): def popular_datasets(count=10):
import random import random
   
publisher = None publisher = None
datasets = None datasets = None
publishers = _get_publishers(30) publishers = _get_publishers(30)
total = len(publishers) total = len(publishers)
while not publisher or not datasets: while not publisher or not datasets:
rand = random.randrange(0, total) rand = random.randrange(0, total)
publisher = publishers[rand][0] publisher = publishers[rand][0]
if not publisher.state == 'active': if not publisher.state == 'active':
publisher = None publisher = None
continue continue
datasets = _datasets_for_publisher(publisher, 10)[:count] datasets = _datasets_for_publisher(publisher, 10)[:count]
   
ctx = { ctx = {
'datasets': datasets, 'datasets': datasets,
'publisher': publisher 'publisher': publisher
} }
return base.render_snippet('ga_report/ga_popular_datasets.html', **ctx) return base.render_snippet('ga_report/ga_popular_datasets.html', **ctx)
   
def single_popular_dataset(top=20): def single_popular_dataset(top=20):
'''Returns a random dataset from the most popular ones. '''Returns a random dataset from the most popular ones.
   
:param top: the number of top datasets to select from :param top: the number of top datasets to select from
''' '''
import random import random
   
top_datasets = model.Session.query(GA_Url).\ top_datasets = model.Session.query(GA_Url).\
filter(GA_Url.url.like('/dataset/%')).\ filter(GA_Url.url.like('/dataset/%')).\
order_by('ga_url.pageviews::int desc') order_by('ga_url.pageviews::int desc')
num_top_datasets = top_datasets.count() num_top_datasets = top_datasets.count()
   
dataset = None dataset = None
if num_top_datasets: if num_top_datasets:
count = 0 count = 0
while not dataset: while not dataset:
rand = random.randrange(0, min(top, num_top_datasets)) rand = random.randrange(0, min(top, num_top_datasets))
ga_url = top_datasets[rand] ga_url = top_datasets[rand]
dataset = model.Package.get(ga_url.url[len('/dataset/'):]) dataset = model.Package.get(ga_url.url[len('/dataset/'):])
if dataset and not dataset.state == 'active': if dataset and not dataset.state == 'active':
dataset = None dataset = None
count += 1 # When testing, it is possible that top datasets are not available
if count > 10: # so only go round this loop a few times before falling back on
break # a random dataset.
  count += 1
  if count > 10:
  break
if not dataset: if not dataset:
# fallback # fallback
dataset = model.Session.query(model.Package)\ dataset = model.Session.query(model.Package)\
.filter_by(state='active').first() .filter_by(state='active').first()
if not dataset: if not dataset:
return None return None
dataset_dict = get_action('package_show')({'model': model, dataset_dict = get_action('package_show')({'model': model,
'session': model.Session}, 'session': model.Session,
  'validate': False},
{'id':dataset.id}) {'id':dataset.id})
return dataset_dict return dataset_dict
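The retry-then-fallback selection used above can be sketched in isolation. This is a minimal sketch, not part of the extension: `pick_random_active` and its list-of-dicts input are illustrative assumptions standing in for the GA_Url query and Package lookup.

```python
import random

def pick_random_active(candidates, top=20, max_tries=10):
    """Pick a random 'active' item from the first `top` candidates.

    Mirrors the shape of single_popular_dataset(): try a few random
    picks from the top slice; if none is active, fall back to the
    first active item anywhere in the list (or None).
    """
    pool = candidates[:top]
    for _ in range(max_tries):
        if not pool:
            break
        choice = random.choice(pool)
        if choice.get('state') == 'active':
            return choice
    # Fallback: scan everything for the first active item.
    for item in candidates:
        if item.get('state') == 'active':
            return item
    return None
```

The bounded retry loop keeps the happy path cheap while the fallback guarantees a result whenever any active item exists at all.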
   
def single_popular_dataset_html(top=20): def single_popular_dataset_html(top=20):
dataset_dict = single_popular_dataset(top) dataset_dict = single_popular_dataset(top)
groups = dataset_dict.get('groups', []) groups = dataset_dict.get('groups', [])
publishers = [ g for g in groups if g.get('type') == 'publisher' ] publishers = [ g for g in groups if g.get('type') == 'publisher' ]
publisher = publishers[0] if publishers else {'name':'', 'title': ''} publisher = publishers[0] if publishers else {'name':'', 'title': ''}
context = { context = {
'dataset': dataset_dict, 'dataset': dataset_dict,
'publisher': publisher 'publisher': publisher
} }
return base.render_snippet('ga_report/ga_popular_single.html', **context) return base.render_snippet('ga_report/ga_popular_single.html', **context)
   
   
def most_popular_datasets(publisher, count=20): def most_popular_datasets(publisher, count=20):
   
if not publisher: if not publisher:
_log.error("No valid publisher passed to 'most_popular_datasets'") _log.error("No valid publisher passed to 'most_popular_datasets'")
return "" return ""
   
results = _datasets_for_publisher(publisher, count) results = _datasets_for_publisher(publisher, count)
   
ctx = { ctx = {
'dataset_count': len(results), 'dataset_count': len(results),
'datasets': results, 'datasets': results,
   
'publisher': publisher 'publisher': publisher
} }
   
return base.render_snippet('ga_report/publisher/popular.html', **ctx) return base.render_snippet('ga_report/publisher/popular.html', **ctx)
   
def _datasets_for_publisher(publisher, count): def _datasets_for_publisher(publisher, count):
datasets = {} datasets = {}
entries = model.Session.query(GA_Url).\ entries = model.Session.query(GA_Url).\
filter(GA_Url.department_id==publisher.name).\ filter(GA_Url.department_id==publisher.name).\
filter(GA_Url.url.like('/dataset/%')).\ filter(GA_Url.url.like('/dataset/%')).\
order_by('ga_url.pageviews::int desc').all() order_by('ga_url.pageviews::int desc').all()
for entry in entries: for entry in entries:
if len(datasets) < count: if len(datasets) < count:
p = model.Package.get(entry.url[len('/dataset/'):]) p = model.Package.get(entry.url[len('/dataset/'):])
  if not p:
  _log.warning("Could not find Package for {url}".format(url=entry.url))
  continue
   
if p not in datasets: if p not in datasets:
datasets[p] = {'views':0, 'visits': 0} datasets[p] = {'views':0, 'visits': 0}
datasets[p]['views'] = datasets[p]['views'] + int(entry.pageviews) datasets[p]['views'] = datasets[p]['views'] + int(entry.pageviews)
datasets[p]['visits'] = datasets[p]['visits'] + int(entry.visitors) datasets[p]['visits'] = datasets[p]['visits'] + int(entry.visits)
   
results = [] results = []
for k, v in datasets.iteritems(): for k, v in datasets.iteritems():
results.append((k,v['views'],v['visits'])) results.append((k,v['views'],v['visits']))
   
return sorted(results, key=operator.itemgetter(1), reverse=True) return sorted(results, key=operator.itemgetter(1), reverse=True)
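The per-dataset aggregation and ranking in `_datasets_for_publisher` can be sketched standalone. A minimal sketch under stated assumptions: `aggregate_and_rank` and its `(key, views, visits)` tuple input are hypothetical stand-ins for the GA_Url rows, not the extension's API.

```python
import operator

def aggregate_and_rank(rows, count=20):
    """Sum views/visits per key, then rank by total views (descending).

    `rows` is an iterable of (key, views, visits) tuples, standing in
    for the GA_Url records queried above.
    """
    totals = {}
    for key, views, visits in rows:
        if key not in totals:
            if len(totals) >= count:
                continue  # cap the number of distinct datasets kept
            totals[key] = {'views': 0, 'visits': 0}
        totals[key]['views'] += int(views)
        totals[key]['visits'] += int(visits)
    results = [(k, v['views'], v['visits']) for k, v in totals.items()]
    return sorted(results, key=operator.itemgetter(1), reverse=True)
```

Note the cap only blocks *new* keys; rows for datasets already being tracked keep accumulating, which is usually what a "top N" report wants.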
   
import logging import logging
import ckan.lib.helpers as h import ckan.lib.helpers as h
import ckan.plugins as p import ckan.plugins as p
from ckan.plugins import implements, toolkit from ckan.plugins import implements, toolkit
   
from ckanext.ga_report.helpers import (most_popular_datasets, from ckanext.ga_report.helpers import (most_popular_datasets,
popular_datasets, popular_datasets,
single_popular_dataset) single_popular_dataset)
   
log = logging.getLogger('ckanext.ga-report') log = logging.getLogger('ckanext.ga-report')
   
class GAReportPlugin(p.SingletonPlugin): class GAReportPlugin(p.SingletonPlugin):
implements(p.IConfigurer, inherit=True) implements(p.IConfigurer, inherit=True)
implements(p.IRoutes, inherit=True) implements(p.IRoutes, inherit=True)
implements(p.ITemplateHelpers, inherit=True) implements(p.ITemplateHelpers, inherit=True)
   
def update_config(self, config): def update_config(self, config):
toolkit.add_template_directory(config, 'templates') toolkit.add_template_directory(config, 'templates')
toolkit.add_public_directory(config, 'public') toolkit.add_public_directory(config, 'public')
   
def get_helpers(self): def get_helpers(self):
""" """
A dictionary of extra helpers that expose A dictionary of extra helpers that expose
GA report info to templates. GA report info to templates.
""" """
return { return {
'ga_report_installed': lambda: True, 'ga_report_installed': lambda: True,
'popular_datasets': popular_datasets, 'popular_datasets': popular_datasets,
'most_popular_datasets': most_popular_datasets, 'most_popular_datasets': most_popular_datasets,
'single_popular_dataset': single_popular_dataset 'single_popular_dataset': single_popular_dataset
} }
   
def after_map(self, map): def after_map(self, map):
# GaReport # GaReport
map.connect( map.connect(
'/data/site-usage', '/data/site-usage',
controller='ckanext.ga_report.controller:GaReport', controller='ckanext.ga_report.controller:GaReport',
action='index' action='index'
) )
map.connect( map.connect(
'/data/site-usage/data_{month}.csv', '/data/site-usage/data_{month}.csv',
controller='ckanext.ga_report.controller:GaReport', controller='ckanext.ga_report.controller:GaReport',
action='csv' action='csv'
) )
  map.connect(
  '/data/site-usage/downloads',
  controller='ckanext.ga_report.controller:GaReport',
  action='downloads'
  )
  map.connect(
  '/data/site-usage/downloads_{month}.csv',
  controller='ckanext.ga_report.controller:GaReport',
  action='csv_downloads'
  )
   
# GaDatasetReport # GaDatasetReport
map.connect( map.connect(
'/data/site-usage/publisher', '/data/site-usage/publisher',
controller='ckanext.ga_report.controller:GaDatasetReport', controller='ckanext.ga_report.controller:GaDatasetReport',
action='publishers' action='publishers'
) )
map.connect( map.connect(
'/data/site-usage/publishers_{month}.csv', '/data/site-usage/publishers_{month}.csv',
controller='ckanext.ga_report.controller:GaDatasetReport', controller='ckanext.ga_report.controller:GaDatasetReport',
action='publisher_csv' action='publisher_csv'
) )
map.connect( map.connect(
'/data/site-usage/dataset/datasets_{id}_{month}.csv', '/data/site-usage/dataset/datasets_{id}_{month}.csv',
controller='ckanext.ga_report.controller:GaDatasetReport', controller='ckanext.ga_report.controller:GaDatasetReport',
action='dataset_csv' action='dataset_csv'
) )
map.connect( map.connect(
'/data/site-usage/dataset', '/data/site-usage/dataset',
controller='ckanext.ga_report.controller:GaDatasetReport', controller='ckanext.ga_report.controller:GaDatasetReport',
action='read' action='read'
) )
map.connect( map.connect(
'/data/site-usage/dataset/{id}', '/data/site-usage/dataset/{id}',
controller='ckanext.ga_report.controller:GaDatasetReport', controller='ckanext.ga_report.controller:GaDatasetReport',
action='read_publisher' action='read_publisher'
) )
return map return map
   
   
  .table-condensed td.sparkline-cell {
  padding: 1px 0 0 0;
  width: 108px;
  text-align: center;
  /* Hack to hide the momentary flash of text
  * before sparklines are fully rendered */
  font-size: 1px;
  color: transparent;
  overflow: hidden;
  }
  .rickshaw_chart_container {
  position: relative;
  height: 350px;
  margin: 0 auto 20px auto;
  }
  .rickshaw_chart {
  position: absolute;
  left: 40px;
  width: 500px;
  top: 0;
  bottom: 0;
  }
  .rickshaw_legend {
  background: transparent;
  width: 100%;
  padding-top: 4px;
  }
  .rickshaw_y_axis {
  position: absolute;
  top: 0;
  bottom: 0;
  width: 40px;
  }
  .rickshaw_legend .label {
  background: transparent !important;
  color: #000000 !important;
  font-weight: normal !important;
  }
  .rickshaw_legend .instructions {
  color: #000;
  margin-bottom: 6px;
  }
 
  .rickshaw_legend .line .action {
  display: none;
  }
  .rickshaw_legend .line .swatch {
  display: block;
  float: left;
  }
  .rickshaw_legend .line .label {
  display: block;
  white-space: normal;
  float: left;
  width: 200px;
  }
  .rickshaw_legend .line .label:hover {
  text-decoration: underline;
  }
 
  .ga-reports-table .td-numeric {
  text-align: center;
  }
 
  var CKAN = CKAN || {};
  CKAN.GA_Reports = {};
 
  CKAN.GA_Reports.render_rickshaw = function( css_name, data, mode, colorscheme ) {
  var graphLegends = $('#graph-legend-container');
 
  if (!Modernizr.svg) {
  $("#chart_"+css_name)
  .html( '<div class="alert">Your browser does not support vector graphics. No graphs can be rendered.</div>')
  .closest('.rickshaw_chart_container').css('height',50);
  var myLegend = $('<div id="legend_'+css_name+'"/>')
  .html('(Graph cannot be rendered)')
  .appendTo(graphLegends);
  return;
  }
  var myLegend = $('<div id="legend_'+css_name+'"/>').appendTo(graphLegends);
 
  var palette = new Rickshaw.Color.Palette( { scheme: colorscheme } );
  $.each(data, function(i, object) {
  object['color'] = palette.color();
  });
  // Rickshaw renders the legend in reverse order...
  data.reverse();
 
  var graphElement = document.querySelector("#chart_"+css_name);
 
  var graph = new Rickshaw.Graph( {
element: graphElement,
  renderer: mode,
  series: data ,
  height: 328
  });
  var x_axis = new Rickshaw.Graph.Axis.Time( { graph: graph } );
  var y_axis = new Rickshaw.Graph.Axis.Y( {
  graph: graph,
  orientation: 'left',
  tickFormat: Rickshaw.Fixtures.Number.formatKMBT,
  element: document.getElementById('y_axis_'+css_name)
  } );
  var legend = new Rickshaw.Graph.Legend( {
  element: document.querySelector('#legend_'+css_name),
  graph: graph
  } );
  var shelving = new Rickshaw.Graph.Behavior.Series.Toggle( {
  graph: graph,
  legend: legend
  } );
  myLegend.prepend('<div class="instructions">Click on a series below to isolate its graph:</div>');
  graph.render();
  };
 
  CKAN.GA_Reports.bind_sparklines = function() {
/*
 * Draw the sparkline graphs when the 'totals' tab is shown.
 * They cannot be rendered while the tab is hidden.
 */
  var created = false;
  $('a[href="#totals"]').on(
  'shown',
  function() {
  if (!created) {
  var sparkOptions = {
  enableTagOptions: true,
  type: 'line',
  width: 100,
  height: 26,
  chartRangeMin: 0,
  spotColor: '',
  maxSpotColor: '',
  minSpotColor: '',
  highlightSpotColor: '#000000',
  lineColor: '#3F8E6D',
  fillColor: '#B7E66B'
  };
  $('.sparkline').sparkline('html',sparkOptions);
  created = true;
  }
  $.sparkline_display_visible();
  }
  );
  };
 
  CKAN.GA_Reports.bind_sidebar = function() {
/*
 * When a tab is shown, display the matching rickshaw graph's
 * legend in the sidebar. Do not call this before all graphs
 * have been rendered.
 */
  $('a[data-toggle="hashtab"]').on(
  'shown',
  function(e) {
  var href = $(e.target).attr('href');
  var pane = $(href);
if (!pane.length) { console.error('bad href',href); return; }
  var legend_name = "none";
  var graph = pane.find('.rickshaw_chart');
  if (graph.length) {
  legend_name = graph.attr('id').replace('chart_','');
  }
  legend_name = '#legend_'+legend_name;
  $('#graph-legend-container > *').hide();
  $('#graph-legend-container .instructions').show();
  $(legend_name).show();
  }
  );
  /* The first tab might already have been shown */
  $('li.active > a[data-toggle="hashtab"]').trigger('shown');
  };
 
  CKAN.GA_Reports.bind_month_selector = function() {
  var handler = function(e) {
  var target = $(e.delegateTarget);
  var form = target.closest('form');
  var url = form.attr('action')+'?month='+target.val()+window.location.hash;
  window.location = url;
  };
var selectors = $('select[name="month"]');
if (!selectors.length) { return; }
selectors.bind('change', handler);
  };
 
  /*
  * Collection of shims to allow d3 and Rickshaw to load, error-free
  * (but ultimately unusable) on Internet Explorer 7. The browser's
  * API lacks several crucial functions which these libraries depend
  * upon to load; we try to hide these errors from the user.
  *
  * With thanks to Array functions from:
  * http://stackoverflow.com/questions/2790001/fixing-javascript-array-functions-in-internet-explorer-indexof-foreach-etc
  *
  * Use (Modernizr.svg==true) to detect whether it's okay to draw a graph.
  */
  'use strict';
 
  window.Element = window.Element || {'prototype': {}};
  window.CSSStyleDeclaration = window.CSSStyleDeclaration || {'prototype':{}};
 
  // Add ECMA262-5 method binding if not supported natively
  //
  if (!('bind' in Function.prototype)) {
  Function.prototype.bind= function(owner) {
  var that= this;
  if (arguments.length<=1) {
  return function() {
  return that.apply(owner, arguments);
  };
  } else {
  var args= Array.prototype.slice.call(arguments, 1);
  return function() {
  return that.apply(owner, arguments.length===0? args : args.concat(Array.prototype.slice.call(arguments)));
  };
  }
  };
  }
 
  // Add ECMA262-5 string trim if not supported natively
  //
  if (!('trim' in String.prototype)) {
  String.prototype.trim= function() {
  return this.replace(/^\s+/, '').replace(/\s+$/, '');
  };
  }
 
  // Add ECMA262-5 Array methods if not supported natively
  //
  if (!('indexOf' in Array.prototype)) {
  Array.prototype.indexOf= function(find, i /*opt*/) {
  if (i===undefined) i= 0;
  if (i<0) i+= this.length;
  if (i<0) i= 0;
  for (var n= this.length; i<n; i++)
  if (i in this && this[i]===find)
  return i;
  return -1;
  };
  }
  if (!('lastIndexOf' in Array.prototype)) {
  Array.prototype.lastIndexOf= function(find, i /*opt*/) {
  if (i===undefined) i= this.length-1;
  if (i<0) i+= this.length;
  if (i>this.length-1) i= this.length-1;
  for (i++; i-->0;) /* i++ because from-argument is sadly inclusive */
  if (i in this && this[i]===find)
  return i;
  return -1;
  };
  }
  if (!('forEach' in Array.prototype)) {
  Array.prototype.forEach= function(action, that /*opt*/) {
  for (var i= 0, n= this.length; i<n; i++)
  if (i in this)
  action.call(that, this[i], i, this);
  };
  }
  if (!('map' in Array.prototype)) {
  Array.prototype.map= function(mapper, that /*opt*/) {
  var other= new Array(this.length);
  for (var i= 0, n= this.length; i<n; i++)
  if (i in this)
  other[i]= mapper.call(that, this[i], i, this);
  return other;
  };
  }
  if (!('filter' in Array.prototype)) {
  Array.prototype.filter= function(filter, that /*opt*/) {
  var other= [], v;
  for (var i=0, n= this.length; i<n; i++)
  if (i in this && filter.call(that, v= this[i], i, this))
  other.push(v);
  return other;
  };
  }
  if (!('every' in Array.prototype)) {
  Array.prototype.every= function(tester, that /*opt*/) {
  for (var i= 0, n= this.length; i<n; i++)
  if (i in this && !tester.call(that, this[i], i, this))
  return false;
  return true;
  };
  }
  if (!('some' in Array.prototype)) {
  Array.prototype.some= function(tester, that /*opt*/) {
  for (var i= 0, n= this.length; i<n; i++)
  if (i in this && tester.call(that, this[i], i, this))
  return true;
  return false;
  };
  }
 
 
  (function(){function a(a){var b=a.source,d=a.target,e=c(b,d),f=[b];while(b!==e)b=b.parent,f.push(b);var g=f.length;while(d!==e)f.splice(g,0,d),d=d.parent;return f}function b(a){var b=[],c=a.parent;while(c!=null)b.push(a),a=c,c=c.parent;return b.push(a),b}function c(a,c){if(a===c)return a;var d=b(a),e=b(c),f=d.pop(),g=e.pop(),h=null;while(f===g)h=f,f=d.pop(),g=e.pop();return h}function g(a){a.fixed|=2}function h(a){a!==f&&(a.fixed&=1)}function i(){j(),f.fixed&=1,e=f=null}function j(){f.px+=d3.event.dx,f.py+=d3.event.dy,e.resume()}function k(a,b,c){var d=0,e=0;a.charge=0;if(!a.leaf){var f=a.nodes,g=f.length,h=-1,i;while(++h<g){i=f[h];if(i==null)continue;k(i,b,c),a.charge+=i.charge,d+=i.charge*i.cx,e+=i.charge*i.cy}}if(a.point){a.leaf||(a.point.x+=Math.random()-.5,a.point.y+=Math.random()-.5);var j=b*c[a.point.index];a.charge+=a.pointCharge=j,d+=j*a.point.x,e+=j*a.point.y}a.cx=d/a.charge,a.cy=e/a.charge}function l(a){return 20}function m(a){return 1}function o(a){return a.x}function p(a){return a.y}function q(a,b,c){a.y0=b,a.y=c}function t(a){var b=1,c=0,d=a[0][1],e,f=a.length;for(;b<f;++b)(e=a[b][1])>d&&(c=b,d=e);return c}function u(a){return a.reduce(v,0)}function v(a,b){return a+b[1]}function w(a,b){return x(a,Math.ceil(Math.log(b.length)/Math.LN2+1))}function x(a,b){var c=-1,d=+a[0],e=(a[1]-d)/b,f=[];while(++c<=b)f[c]=e*c+d;return f}function y(a){return[d3.min(a),d3.max(a)]}function z(a,b){return a.sort=d3.rebind(a,b.sort),a.children=d3.rebind(a,b.children),a.links=D,a.value=d3.rebind(a,b.value),a.nodes=function(b){return E=!0,(a.nodes=a)(b)},a}function A(a){return a.children}function B(a){return a.value}function C(a,b){return b.value-a.value}function D(a){return d3.merge(a.map(function(a){return(a.children||[]).map(function(b){return{source:a,target:b}})}))}function F(a,b){return a.value-b.value}function G(a,b){var c=a._pack_next;a._pack_next=b,b._pack_prev=a,b._pack_next=c,c._pack_prev=b}function H(a,b){a._pack_next=b,b._pack_prev=a}function I(a,b){var 
c=b.x-a.x,d=b.y-a.y,e=a.r+b.r;return e*e-c*c-d*d>.001}function J(a){function l(a){b=Math.min(a.x-a.r,b),c=Math.max(a.x+a.r,c),d=Math.min(a.y-a.r,d),e=Math.max(a.y+a.r,e)}var b=Infinity,c=-Infinity,d=Infinity,e=-Infinity,f=a.length,g,h,i,j,k;a.forEach(K),g=a[0],g.x=-g.r,g.y=0,l(g);if(f>1){h=a[1],h.x=h.r,h.y=0,l(h);if(f>2){i=a[2],O(g,h,i),l(i),G(g,i),g._pack_prev=i,G(i,h),h=g._pack_next;for(var m=3;m<f;m++){O(g,h,i=a[m]);var n=0,o=1,p=1;for(j=h._pack_next;j!==h;j=j._pack_next,o++)if(I(j,i)){n=1;break}if(n==1)for(k=g._pack_prev;k!==j._pack_prev;k=k._pack_prev,p++)if(I(k,i)){p<o&&(n=-1,j=k);break}n==0?(G(g,i),h=i,l(i)):n>0?(H(g,j),h=j,m--):(H(j,h),g=j,m--)}}}var q=(b+c)/2,r=(d+e)/2,s=0;for(var m=0;m<f;m++){var t=a[m];t.x-=q,t.y-=r,s=Math.max(s,t.r+Math.sqrt(t.x*t.x+t.y*t.y))}return a.forEach(L),s}function K(a){a._pack_next=a._pack_prev=a}function L(a){delete a._pack_next,delete a._pack_prev}function M(a){var b=a.children;b&&b.length?(b.forEach(M),a.r=J(b)):a.r=Math.sqrt(a.value)}function N(a,b,c,d){var e=a.children;a.x=b+=d*a.x,a.y=c+=d*a.y,a.r*=d;if(e){var f=-1,g=e.length;while(++f<g)N(e[f],b,c,d)}}function O(a,b,c){var d=a.r+c.r,e=b.x-a.x,f=b.y-a.y;if(d&&(e||f)){var g=b.r+c.r,h=Math.sqrt(e*e+f*f),i=Math.max(-1,Math.min(1,(d*d+h*h-g*g)/(2*d*h))),j=Math.acos(i),k=i*(d/=h),l=Math.sin(j)*d;c.x=a.x+k*e+l*f,c.y=a.y+k*f-l*e}else c.x=a.x+d,c.y=a.y}function P(a){return 1+d3.max(a,function(a){return a.y})}function Q(a){return a.reduce(function(a,b){return a+b.x},0)/a.length}function R(a){var b=a.children;return b&&b.length?R(b[0]):a}function S(a){var b=a.children,c;return b&&(c=b.length)?S(b[c-1]):a}function T(a,b){return a.parent==b.parent?1:2}function U(a){var b=a.children;return b&&b.length?b[0]:a._tree.thread}function V(a){var b=a.children,c;return b&&(c=b.length)?b[c-1]:a._tree.thread}function W(a,b){var c=a.children;if(c&&(e=c.length)){var d,e,f=-1;while(++f<e)b(d=W(c[f],b),a)>0&&(a=d)}return a}function X(a,b){return a.x-b.x}function Y(a,b){return b.x-a.x}function 
Z(a,b){return a.depth-b.depth}function $(a,b){function c(a,d){var e=a.children;if(e&&(i=e.length)){var f,g=null,h=-1,i;while(++h<i)f=e[h],c(f,g),g=f}b(a,d)}c(a,null)}function _(a){var b=0,c=0,d=a.children,e=d.length,f;while(--e>=0)f=d[e]._tree,f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function ba(a,b,c){a=a._tree,b=b._tree;var d=c/(b.number-a.number);a.change+=d,b.change-=d,b.shift+=c,b.prelim+=c,b.mod+=c}function bb(a,b,c){return a._tree.ancestor.parent==b.parent?a._tree.ancestor:c}function bc(a){return{x:a.x,y:a.y,dx:a.dx,dy:a.dy}}function bd(a,b){var c=a.x+b[3],d=a.y+b[0],e=a.dx-b[1]-b[3],f=a.dy-b[0]-b[2];return e<0&&(c+=e/2,e=0),f<0&&(d+=f/2,f=0),{x:c,y:d,dx:e,dy:f}}d3.layout={},d3.layout.bundle=function(){return function(b){var c=[],d=-1,e=b.length;while(++d<e)c.push(a(b[d]));return c}},d3.layout.chord=function(){function j(){var a={},j=[],l=d3.range(e),m=[],n,o,p,q,r;b=[],c=[],n=0,q=-1;while(++q<e){o=0,r=-1;while(++r<e)o+=d[q][r];j.push(o),m.push(d3.range(e)),n+=o}g&&l.sort(function(a,b){return g(j[a],j[b])}),h&&m.forEach(function(a,b){a.sort(function(a,c){return h(d[b][a],d[b][c])})}),n=(2*Math.PI-f*e)/n,o=0,q=-1;while(++q<e){p=o,r=-1;while(++r<e){var s=l[q],t=m[s][r],u=d[s][t],v=o,w=o+=u*n;a[s+"-"+t]={index:s,subindex:t,startAngle:v,endAngle:w,value:u}}c.push({index:s,startAngle:p,endAngle:o,value:(o-p)/n}),o+=f}q=-1;while(++q<e){r=q-1;while(++r<e){var x=a[q+"-"+r],y=a[r+"-"+q];(x.value||y.value)&&b.push(x.value<y.value?{source:y,target:x}:{source:x,target:y})}}i&&k()}function k(){b.sort(function(a,b){return i((a.source.value+a.target.value)/2,(b.source.value+b.target.value)/2)})}var a={},b,c,d,e,f=0,g,h,i;return a.matrix=function(f){return arguments.length?(e=(d=f)&&d.length,b=c=null,a):d},a.padding=function(d){return arguments.length?(f=d,b=c=null,a):f},a.sortGroups=function(d){return arguments.length?(g=d,b=c=null,a):g},a.sortSubgroups=function(c){return arguments.length?(h=c,b=null,a):h},a.sortChords=function(c){return 
arguments.length?(i=c,b&&k(),a):i},a.chords=function(){return b||j(),b},a.groups=function(){return c||j(),c},a},d3.layout.force=function(){function A(a){return function(b,c,d,e,f){if(b.point!==a){var g=b.cx-a.x,h=b.cy-a.y,i=1/Math.sqrt(g*g+h*h);if((e-c)*i<t){var j=b.charge*i*i;return a.px-=g*j,a.py-=h*j,!0}if(b.point&&isFinite(i)){var j=b.pointCharge*i*i;a.px-=g*j,a.py-=h*j}}return!b.charge}}function B(){var a=v.length,d=w.length,e,f,g,h,i,j,l,m,p;for(f=0;f<d;++f){g=w[f],h=g.source,i=g.target,m=i.x-h.x,p=i.y-h.y;if(j=m*m+p*p)j=n*y[f]*((j=Math.sqrt(j))-x[f])/j,m*=j,p*=j,i.x-=m*(l=h.weight/(i.weight+h.weight)),i.y-=p*l,h.x+=m*(l=1-l),h.y+=p*l}if(l=n*s){m=c[0]/2,p=c[1]/2,f=-1;if(l)while(++f<a)g=v[f],g.x+=(m-g.x)*l,g.y+=(p-g.y)*l}if(r){k(e=d3.geom.quadtree(v),n,z),f=-1;while(++f<a)(g=v[f]).fixed||e.visit(A(g))}f=-1;while(++f<a)g=v[f],g.fixed?(g.x=g.px,g.y=g.py):(g.x-=(g.px-(g.px=g.x))*o,g.y-=(g.py-(g.py=g.y))*o);return b.tick({type:"tick",alpha:n}),(n*=.99)<.005}function C(b){g(f=b),e=a}var a={},b=d3.dispatch("tick"),c=[1,1],d,n,o=.9,p=l,q=m,r=-30,s=.1,t=.8,u,v=[],w=[],x,y,z;return a.on=function(c,d){return b.on(c,d),a},a.nodes=function(b){return arguments.length?(v=b,a):v},a.links=function(b){return arguments.length?(w=b,a):w},a.size=function(b){return arguments.length?(c=b,a):c},a.linkDistance=function(b){return arguments.length?(p=d3.functor(b),a):p},a.distance=a.linkDistance,a.linkStrength=function(b){return arguments.length?(q=d3.functor(b),a):q},a.friction=function(b){return arguments.length?(o=b,a):o},a.charge=function(b){return arguments.length?(r=typeof b=="function"?b:+b,a):r},a.gravity=function(b){return arguments.length?(s=b,a):s},a.theta=function(b){return arguments.length?(t=b,a):t},a.start=function(){function k(a,c){var d=l(b),e=-1,f=d.length,g;while(++e<f)if(!isNaN(g=d[e][a]))return g;return Math.random()*c}function l(){if(!i){i=[];for(d=0;d<e;++d)i[d]=[];for(d=0;d<f;++d){var 
a=w[d];i[a.source.index].push(a.target),i[a.target.index].push(a.source)}}return i[b]}var b,d,e=v.length,f=w.length,g=c[0],h=c[1],i,j;for(b=0;b<e;++b)(j=v[b]).index=b,j.weight=0;x=[],y=[];for(b=0;b<f;++b)j=w[b],typeof j.source=="number"&&(j.source=v[j.source]),typeof j.target=="number"&&(j.target=v[j.target]),x[b]=p.call(this,j,b),y[b]=q.call(this,j,b),++j.source.weight,++j.target.weight;for(b=0;b<e;++b)j=v[b],isNaN(j.x)&&(j.x=k("x",g)),isNaN(j.y)&&(j.y=k("y",h)),isNaN(j.px)&&(j.px=j.x),isNaN(j.py)&&(j.py=j.y);z=[];if(typeof r=="function")for(b=0;b<e;++b)z[b]=+r.call(this,v[b],b);else for(b=0;b<e;++b)z[b]=r;return a.resume()},a.resume=function(){return n=.1,d3.timer(B),a},a.stop=function(){return n=0,a},a.drag=function(){d||(d=d3.behavior.drag().on("dragstart",C).on("drag",j).on("dragend",i)),this.on("mouseover.force",g).on("mouseout.force",h).call(d)},a};var e,f;d3.layout.partition=function(){function c(a,b,d,e){var f=a.children;a.x=b,a.y=a.depth*e,a.dx=d,a.dy=e;if(f&&(h=f.length)){var g=-1,h,i,j;d=a.value?d/a.value:0;while(++g<h)c(i=f[g],b,j=i.value*d,e),b+=j}}function d(a){var b=a.children,c=0;if(b&&(f=b.length)){var e=-1,f;while(++e<f)c=Math.max(c,d(b[e]))}return 1+c}function e(e,f){var g=a.call(this,e,f);return c(g[0],0,b[0],b[1]/d(g[0])),g}var a=d3.layout.hierarchy(),b=[1,1];return e.size=function(a){return arguments.length?(b=a,e):b},z(e,a)},d3.layout.pie=function(){function f(g,h){var i=g.map(function(b,c){return+a.call(f,b,c)}),j=+(typeof c=="function"?c.apply(this,arguments):c),k=((typeof e=="function"?e.apply(this,arguments):e)-c)/d3.sum(i),l=d3.range(g.length);b!=null&&l.sort(b===n?function(a,b){return i[b]-i[a]}:function(a,c){return b(g[a],g[c])});var m=l.map(function(a){return{data:g[a],value:d=i[a],startAngle:j,endAngle:j+=d*k}});return g.map(function(a,b){return m[l[b]]})}var a=Number,b=n,c=0,e=2*Math.PI;return f.value=function(b){return arguments.length?(a=b,f):a},f.sort=function(a){return arguments.length?(b=a,f):b},f.startAngle=function(a){return 
arguments.length?(c=a,f):c},f.endAngle=function(a){return arguments.length?(e=a,f):e},f};var n={};d3.layout.stack=function(){function g(h,i){var j=h.map(function(b,c){return a.call(g,b,c)}),k=j.map(function(a,b){return a.map(function(a,b){return[e.call(g,a,b),f.call(g,a,b)]})}),l=b.call(g,k,i);j=d3.permute(j,l),k=d3.permute(k,l);var m=c.call(g,k,i),n=j.length,o=j[0].length,p,q,r;for(q=0;q<o;++q){d.call(g,j[0][q],r=m[q],k[0][q][1]);for(p=1;p<n;++p)d.call(g,j[p][q],r+=k[p-1][q][1],k[p][q][1])}return h}var a=Object,b=r["default"],c=s.zero,d=q,e=o,f=p;return g.values=function(b){return arguments.length?(a=b,g):a},g.order=function(a){return arguments.length?(b=typeof a=="function"?a:r[a],g):b},g.offset=function(a){return arguments.length?(c=typeof a=="function"?a:s[a],g):c},g.x=function(a){return arguments.length?(e=a,g):e},g.y=function(a){return arguments.length?(f=a,g):f},g.out=function(a){return arguments.length?(d=a,g):d},g};var r={"inside-out":function(a){var b=a.length,c,d,e=a.map(t),f=a.map(u),g=d3.range(b).sort(function(a,b){return e[a]-e[b]}),h=0,i=0,j=[],k=[];for(c=0;c<b;++c)d=g[c],h<i?(h+=f[d],j.push(d)):(i+=f[d],k.push(d));return k.reverse().concat(j)},reverse:function(a){return d3.range(a.length).reverse()},"default":function(a){return d3.range(a.length)}},s={silhouette:function(a){var b=a.length,c=a[0].length,d=[],e=0,f,g,h,i=[];for(g=0;g<c;++g){for(f=0,h=0;f<b;f++)h+=a[f][g][1];h>e&&(e=h),d.push(h)}for(g=0;g<c;++g)i[g]=(e-d[g])/2;return i},wiggle:function(a){var b=a.length,c=a[0],d=c.length,e=0,f,g,h,i,j,k,l,m,n,o=[];o[0]=m=n=0;for(g=1;g<d;++g){for(f=0,i=0;f<b;++f)i+=a[f][g][1];for(f=0,j=0,l=c[g][0]-c[g-1][0];f<b;++f){for(h=0,k=(a[f][g][1]-a[f][g-1][1])/(2*l);h<f;++h)k+=(a[h][g][1]-a[h][g-1][1])/l;j+=k*a[f][g][1]}o[g]=m-=i?j/i*l:0,m<n&&(n=m)}for(g=0;g<d;++g)o[g]-=n;return o},expand:function(a){var b=a.length,c=a[0].length,d=1/b,e,f,g,h=[];for(f=0;f<c;++f){for(e=0,g=0;e<b;e++)g+=a[e][f][1];if(g)for(e=0;e<b;e++)a[e][f][1]/=g;else 
for(e=0;e<b;e++)a[e][f][1]=d}for(f=0;f<c;++f)h[f]=0;return h},zero:function(a){var b=-1,c=a[0].length,d=[];while(++b<c)d[b]=0;return d}};d3.layout.histogram=function(){function e(e,f){var g=[],h=e.map(b,this),i=c.call(this,h,f),j=d.call(this,i,h,f),k,f=-1,l=h.length,m=j.length-1,n=a?1:1/l,o;while(++f<m)k=g[f]=[],k.dx=j[f+1]-(k.x=j[f]),k.y=0;f=-1;while(++f<l)o=h[f],o>=i[0]&&o<=i[1]&&(k=g[d3.bisect(j,o,1,m)-1],k.y+=n,k.push(e[f]));return g}var a=!0,b=Number,c=y,d=w;return e.value=function(a){return arguments.length?(b=a,e):b},e.range=function(a){return arguments.length?(c=d3.functor(a),e):c},e.bins=function(a){return arguments.length?(d=typeof a=="number"?function(b){return x(b,a)}:d3.functor(a),e):d},e.frequency=function(b){return arguments.length?(a=!!b,e):a},e},d3.layout.hierarchy=function(){function e(f,h,i){var j=b.call(g,f,h),k=E?f:{data:f};k.depth=h,i.push(k);if(j&&(m=j.length)){var l=-1,m,n=k.children=[],o=0,p=h+1;while(++l<m)d=e(j[l],p,i),d.parent=k,n.push(d),o+=d.value;a&&n.sort(a),c&&(k.value=o)}else c&&(k.value=+c.call(g,f,h)||0);return k}function f(a,b){var d=a.children,e=0;if(d&&(i=d.length)){var h=-1,i,j=b+1;while(++h<i)e+=f(d[h],j)}else c&&(e=+c.call(g,E?a:a.data,b)||0);return c&&(a.value=e),e}function g(a){var b=[];return e(a,0,b),b}var a=C,b=A,c=B;return g.sort=function(b){return arguments.length?(a=b,g):a},g.children=function(a){return arguments.length?(b=a,g):b},g.value=function(a){return arguments.length?(c=a,g):c},g.revalue=function(a){return f(a,0),a},g};var E=!1;d3.layout.pack=function(){function c(c,d){var e=a.call(this,c,d),f=e[0];f.x=0,f.y=0,M(f);var g=b[0],h=b[1],i=1/Math.max(2*f.r/g,2*f.r/h);return N(f,g/2,h/2,i),e}var a=d3.layout.hierarchy().sort(F),b=[1,1];return c.size=function(a){return arguments.length?(b=a,c):b},z(c,a)},d3.layout.cluster=function(){function d(d,e){var f=a.call(this,d,e),g=f[0],h,i=0,j,k;$(g,function(a){var c=a.children;c&&c.length?(a.x=Q(c),a.y=P(c)):(a.x=h?i+=b(a,h):0,a.y=0,h=a)});var 
l=R(g),m=S(g),n=l.x-b(l,m)/2,o=m.x+b(m,l)/2;return $(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=(1-a.y/g.y)*c[1]}),f}var a=d3.layout.hierarchy().sort(null).value(null),b=T,c=[1,1];return d.separation=function(a){return arguments.length?(b=a,d):b},d.size=function(a){return arguments.length?(c=a,d):c},z(d,a)},d3.layout.tree=function(){function d(d,e){function h(a,c){var d=a.children,e=a._tree;if(d&&(f=d.length)){var f,g=d[0],i,k=g,l,m=-1;while(++m<f)l=d[m],h(l,i),k=j(l,i,k),i=l;_(a);var n=.5*(g._tree.prelim+l._tree.prelim);c?(e.prelim=c._tree.prelim+b(a,c),e.mod=e.prelim-n):e.prelim=n}else c&&(e.prelim=c._tree.prelim+b(a,c))}function i(a,b){a.x=a._tree.prelim+b;var c=a.children;if(c&&(e=c.length)){var d=-1,e;b+=a._tree.mod;while(++d<e)i(c[d],b)}}function j(a,c,d){if(c){var e=a,f=a,g=c,h=a.parent.children[0],i=e._tree.mod,j=f._tree.mod,k=g._tree.mod,l=h._tree.mod,m;while(g=V(g),e=U(e),g&&e)h=U(h),f=V(f),f._tree.ancestor=a,m=g._tree.prelim+k-e._tree.prelim-i+b(g,e),m>0&&(ba(bb(g,a,d),a,m),i+=m,j+=m),k+=g._tree.mod,i+=e._tree.mod,l+=h._tree.mod,j+=f._tree.mod;g&&!V(f)&&(f._tree.thread=g,f._tree.mod+=k-j),e&&!U(h)&&(h._tree.thread=e,h._tree.mod+=i-l,d=a)}return d}var f=a.call(this,d,e),g=f[0];$(g,function(a,b){a._tree={ancestor:a,prelim:0,mod:0,change:0,shift:0,number:b?b._tree.number+1:0}}),h(g),i(g,-g._tree.prelim);var k=W(g,Y),l=W(g,X),m=W(g,Z),n=k.x-b(k,l)/2,o=l.x+b(l,k)/2,p=m.depth||1;return $(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=a.depth/p*c[1],delete a._tree}),f}var a=d3.layout.hierarchy().sort(null).value(null),b=T,c=[1,1];return d.separation=function(a){return arguments.length?(b=a,d):b},d.size=function(a){return arguments.length?(c=a,d):c},z(d,a)},d3.layout.treemap=function(){function i(a,b){var c=-1,d=a.length,e,f;while(++c<d)f=(e=a[c]).value*(b<0?0:b),e.area=isNaN(f)||f<=0?0:f}function j(a){var b=a.children;if(b&&b.length){var 
c=e(a),d=[],f=b.slice(),g,h=Infinity,k,n=Math.min(c.dx,c.dy),o;i(f,c.dx*c.dy/a.value),d.area=0;while((o=f.length)>0)d.push(g=f[o-1]),d.area+=g.area,(k=l(d,n))<=h?(f.pop(),h=k):(d.area-=d.pop().area,m(d,n,c,!1),n=Math.min(c.dx,c.dy),d.length=d.area=0,h=Infinity);d.length&&(m(d,n,c,!0),d.length=d.area=0),b.forEach(j)}}function k(a){var b=a.children;if(b&&b.length){var c=e(a),d=b.slice(),f,g=[];i(d,c.dx*c.dy/a.value),g.area=0;while(f=d.pop())g.push(f),g.area+=f.area,f.z!=null&&(m(g,f.z?c.dx:c.dy,c,!d.length),g.length=g.area=0);b.forEach(k)}}function l(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,i=a.length;while(++g<i){if(!(d=a[g].area))continue;d<f&&(f=d),d>e&&(e=d)}return c*=c,b*=b,c?Math.max(b*e*h/c,c/(b*f*h)):Infinity}function m(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=j?d.dy:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dy=j,h+=k.dx=j?b(k.area/j):0;k.z=!0,k.dx+=d.x+d.dx-h,d.y+=j,d.dy-=j}else{if(e||j>d.dx)j=j?d.dx:0;while(++f<g)k=a[f],k.x=h,k.y=i,k.dx=j,i+=k.dy=j?b(k.area/j):0;k.z=!1,k.dy+=d.y+d.dy-i,d.x+=j,d.dx-=j}}function n(b){var d=g||a(b),e=d[0];return e.x=0,e.y=0,e.dx=c[0],e.dy=c[1],g&&a.revalue(e),i([e],e.dx*e.dy/e.value),(g?k:j)(e),f&&(g=d),d}var a=d3.layout.hierarchy(),b=Math.round,c=[1,1],d=null,e=bc,f=!1,g,h=.5*(1+Math.sqrt(5));return n.size=function(a){return arguments.length?(c=a,n):c},n.padding=function(a){function b(b){var c=a.call(n,b,b.depth);return c==null?bc(b):bd(b,typeof c=="number"?[c,c,c,c]:c)}function c(b){return bd(b,a)}if(!arguments.length)return d;var f;return e=(d=a)==null?bc:(f=typeof a)==="function"?b:f==="number"?(a=[a,a,a,a],c):c,n},n.round=function(a){return arguments.length?(b=a?Math.round:Number,n):b!=Number},n.sticky=function(a){return arguments.length?(f=a,g=null,n):f},n.ratio=function(a){return arguments.length?(h=a,n):h},z(n,a)}})();
  (function() {
  function d3_class(ctor, properties) {
  try {
  for (var key in properties) {
  Object.defineProperty(ctor.prototype, key, {
  value: properties[key],
  enumerable: false
  });
  }
  } catch (e) {
  ctor.prototype = properties;
  }
  }
  function d3_arrayCopy(pseudoarray) {
  var i = -1, n = pseudoarray.length, array = [];
  while (++i < n) array.push(pseudoarray[i]);
  return array;
  }
  function d3_arraySlice(pseudoarray) {
  return Array.prototype.slice.call(pseudoarray);
  }
  function d3_Map() {}
  function d3_identity(d) {
  return d;
  }
  function d3_this() {
  return this;
  }
  function d3_true() {
  return true;
  }
  function d3_functor(v) {
  return typeof v === "function" ? v : function() {
  return v;
  };
  }
  function d3_rebind(target, source, method) {
  return function() {
  var value = method.apply(source, arguments);
  return arguments.length ? target : value;
  };
  }
  function d3_number(x) {
  return x != null && !isNaN(x);
  }
  function d3_zipLength(d) {
  return d.length;
  }
  function d3_splitter(d) {
  return d == null;
  }
  function d3_collapse(s) {
  return s.trim().replace(/\s+/g, " ");
  }
  function d3_range_integerScale(x) {
  var k = 1;
  while (x * k % 1) k *= 10;
  return k;
  }
  function d3_dispatch() {}
  function d3_dispatch_event(dispatch) {
  function event() {
  var z = listeners, i = -1, n = z.length, l;
  while (++i < n) if (l = z[i].on) l.apply(this, arguments);
  return dispatch;
  }
  var listeners = [], listenerByName = new d3_Map;
  event.on = function(name, listener) {
  var l = listenerByName.get(name), i;
  if (arguments.length < 2) return l && l.on;
  if (l) {
  l.on = null;
  listeners = listeners.slice(0, i = listeners.indexOf(l)).concat(listeners.slice(i + 1));
  listenerByName.remove(name);
  }
  if (listener) listeners.push(listenerByName.set(name, {
  on: listener
  }));
  return dispatch;
  };
  return event;
  }
  function d3_format_precision(x, p) {
  return p - (x ? 1 + Math.floor(Math.log(x + Math.pow(10, 1 + Math.floor(Math.log(x) / Math.LN10) - p)) / Math.LN10) : 1);
  }
  function d3_format_typeDefault(x) {
  return x + "";
  }
  function d3_format_group(value) {
  var i = value.lastIndexOf("."), f = i >= 0 ? value.substring(i) : (i = value.length, ""), t = [];
  while (i > 0) t.push(value.substring(i -= 3, i + 3));
  return t.reverse().join(",") + f;
  }
  function d3_formatPrefix(d, i) {
  var k = Math.pow(10, Math.abs(8 - i) * 3);
  return {
  scale: i > 8 ? function(d) {
  return d / k;
  } : function(d) {
  return d * k;
  },
  symbol: d
  };
  }
  function d3_ease_clamp(f) {
  return function(t) {
  return t <= 0 ? 0 : t >= 1 ? 1 : f(t);
  };
  }
  function d3_ease_reverse(f) {
  return function(t) {
  return 1 - f(1 - t);
  };
  }
  function d3_ease_reflect(f) {
  return function(t) {
  return .5 * (t < .5 ? f(2 * t) : 2 - f(2 - 2 * t));
  };
  }
  function d3_ease_identity(t) {
  return t;
  }
  function d3_ease_poly(e) {
  return function(t) {
  return Math.pow(t, e);
  };
  }
  function d3_ease_sin(t) {
  return 1 - Math.cos(t * Math.PI / 2);
  }
  function d3_ease_exp(t) {
  return Math.pow(2, 10 * (t - 1));
  }
  function d3_ease_circle(t) {
  return 1 - Math.sqrt(1 - t * t);
  }
  function d3_ease_elastic(a, p) {
  var s;
  if (arguments.length < 2) p = .45;
  if (arguments.length < 1) {
  a = 1;
  s = p / 4;
  } else s = p / (2 * Math.PI) * Math.asin(1 / a);
  return function(t) {
  return 1 + a * Math.pow(2, 10 * -t) * Math.sin((t - s) * 2 * Math.PI / p);
  };
  }
  function d3_ease_back(s) {
  if (!s) s = 1.70158;
  return function(t) {
  return t * t * ((s + 1) * t - s);
  };
  }
  function d3_ease_bounce(t) {
  return t < 1 / 2.75 ? 7.5625 * t * t : t < 2 / 2.75 ? 7.5625 * (t -= 1.5 / 2.75) * t + .75 : t < 2.5 / 2.75 ? 7.5625 * (t -= 2.25 / 2.75) * t + .9375 : 7.5625 * (t -= 2.625 / 2.75) * t + .984375;
  }
  function d3_eventCancel() {
  d3.event.stopPropagation();
  d3.event.preventDefault();
  }
  function d3_eventSource() {
  var e = d3.event, s;
  while (s = e.sourceEvent) e = s;
  return e;
  }
  function d3_eventDispatch(target) {
  var dispatch = new d3_dispatch, i = 0, n = arguments.length;
  while (++i < n) dispatch[arguments[i]] = d3_dispatch_event(dispatch);
  dispatch.of = function(thiz, argumentz) {
  return function(e1) {
  try {
  var e0 = e1.sourceEvent = d3.event;
  e1.target = target;
  d3.event = e1;
  dispatch[e1.type].apply(thiz, argumentz);
  } finally {
  d3.event = e0;
  }
  };
  };
  return dispatch;
  }
  function d3_transform(m) {
  var r0 = [ m.a, m.b ], r1 = [ m.c, m.d ], kx = d3_transformNormalize(r0), kz = d3_transformDot(r0, r1), ky = d3_transformNormalize(d3_transformCombine(r1, r0, -kz)) || 0;
  if (r0[0] * r1[1] < r1[0] * r0[1]) {
  r0[0] *= -1;
  r0[1] *= -1;
  kx *= -1;
  kz *= -1;
  }
  this.rotate = (kx ? Math.atan2(r0[1], r0[0]) : Math.atan2(-r1[0], r1[1])) * d3_transformDegrees;
  this.translate = [ m.e, m.f ];
  this.scale = [ kx, ky ];
  this.skew = ky ? Math.atan2(kz, ky) * d3_transformDegrees : 0;
  }
  function d3_transformDot(a, b) {
  return a[0] * b[0] + a[1] * b[1];
  }
  function d3_transformNormalize(a) {
  var k = Math.sqrt(d3_transformDot(a, a));
  if (k) {
  a[0] /= k;
  a[1] /= k;
  }
  return k;
  }
  function d3_transformCombine(a, b, k) {
  a[0] += k * b[0];
  a[1] += k * b[1];
  return a;
  }
  function d3_interpolateByName(name) {
  return name == "transform" ? d3.interpolateTransform : d3.interpolate;
  }
  function d3_uninterpolateNumber(a, b) {
  b = b - (a = +a) ? 1 / (b - a) : 0;
  return function(x) {
  return (x - a) * b;
  };
  }
  function d3_uninterpolateClamp(a, b) {
  b = b - (a = +a) ? 1 / (b - a) : 0;
  return function(x) {
  return Math.max(0, Math.min(1, (x - a) * b));
  };
  }
  function d3_Color() {}
  function d3_rgb(r, g, b) {
  return new d3_Rgb(r, g, b);
  }
  function d3_Rgb(r, g, b) {
  this.r = r;
  this.g = g;
  this.b = b;
  }
  function d3_rgb_hex(v) {
  return v < 16 ? "0" + Math.max(0, v).toString(16) : Math.min(255, v).toString(16);
  }
  function d3_rgb_parse(format, rgb, hsl) {
  var r = 0, g = 0, b = 0, m1, m2, name;
  m1 = /([a-z]+)\((.*)\)/i.exec(format);
  if (m1) {
  m2 = m1[2].split(",");
  switch (m1[1]) {
  case "hsl":
  {
  return hsl(parseFloat(m2[0]), parseFloat(m2[1]) / 100, parseFloat(m2[2]) / 100);
  }
  case "rgb":
  {
  return rgb(d3_rgb_parseNumber(m2[0]), d3_rgb_parseNumber(m2[1]), d3_rgb_parseNumber(m2[2]));
  }
  }
  }
  if (name = d3_rgb_names.get(format)) return rgb(name.r, name.g, name.b);
  if (format != null && format.charAt(0) === "#") {
  if (format.length === 4) {
  r = format.charAt(1);
  r += r;
  g = format.charAt(2);
  g += g;
  b = format.charAt(3);
  b += b;
  } else if (format.length === 7) {
  r = format.substring(1, 3);
  g = format.substring(3, 5);
  b = format.substring(5, 7);
  }
  r = parseInt(r, 16);
  g = parseInt(g, 16);
  b = parseInt(b, 16);
  }
  return rgb(r, g, b);
  }
  function d3_rgb_hsl(r, g, b) {
  var min = Math.min(r /= 255, g /= 255, b /= 255), max = Math.max(r, g, b), d = max - min, h, s, l = (max + min) / 2;
  if (d) {
  s = l < .5 ? d / (max + min) : d / (2 - max - min);
  if (r == max) h = (g - b) / d + (g < b ? 6 : 0); else if (g == max) h = (b - r) / d + 2; else h = (r - g) / d + 4;
  h *= 60;
  } else {
  s = h = 0;
  }
  return d3_hsl(h, s, l);
  }
  function d3_rgb_lab(r, g, b) {
  r = d3_rgb_xyz(r);
  g = d3_rgb_xyz(g);
  b = d3_rgb_xyz(b);
  var x = d3_xyz_lab((.4124564 * r + .3575761 * g + .1804375 * b) / d3_lab_X), y = d3_xyz_lab((.2126729 * r + .7151522 * g + .072175 * b) / d3_lab_Y), z = d3_xyz_lab((.0193339 * r + .119192 * g + .9503041 * b) / d3_lab_Z);
  return d3_lab(116 * y - 16, 500 * (x - y), 200 * (y - z));
  }
  function d3_rgb_xyz(r) {
  return (r /= 255) <= .04045 ? r / 12.92 : Math.pow((r + .055) / 1.055, 2.4);
  }
  function d3_rgb_parseNumber(c) {
  var f = parseFloat(c);
  return c.charAt(c.length - 1) === "%" ? Math.round(f * 2.55) : f;
  }
  function d3_hsl(h, s, l) {
  return new d3_Hsl(h, s, l);
  }
  function d3_Hsl(h, s, l) {
  this.h = h;
  this.s = s;
  this.l = l;
  }
  function d3_hsl_rgb(h, s, l) {
  function v(h) {
  if (h > 360) h -= 360; else if (h < 0) h += 360;
  if (h < 60) return m1 + (m2 - m1) * h / 60;
  if (h < 180) return m2;
  if (h < 240) return m1 + (m2 - m1) * (240 - h) / 60;
  return m1;
  }
  function vv(h) {
  return Math.round(v(h) * 255);
  }
  var m1, m2;
  h = h % 360;
  if (h < 0) h += 360;
  s = s < 0 ? 0 : s > 1 ? 1 : s;
  l = l < 0 ? 0 : l > 1 ? 1 : l;
  m2 = l <= .5 ? l * (1 + s) : l + s - l * s;
  m1 = 2 * l - m2;
  return d3_rgb(vv(h + 120), vv(h), vv(h - 120));
  }
  function d3_hcl(h, c, l) {
  return new d3_Hcl(h, c, l);
  }
  function d3_Hcl(h, c, l) {
  this.h = h;
  this.c = c;
  this.l = l;
  }
  function d3_hcl_lab(h, c, l) {
  return d3_lab(l, Math.cos(h *= Math.PI / 180) * c, Math.sin(h) * c);
  }
  function d3_lab(l, a, b) {
  return new d3_Lab(l, a, b);
  }
  function d3_Lab(l, a, b) {
  this.l = l;
  this.a = a;
  this.b = b;
  }
  function d3_lab_rgb(l, a, b) {
  var y = (l + 16) / 116, x = y + a / 500, z = y - b / 200;
  x = d3_lab_xyz(x) * d3_lab_X;
  y = d3_lab_xyz(y) * d3_lab_Y;
  z = d3_lab_xyz(z) * d3_lab_Z;
  return d3_rgb(d3_xyz_rgb(3.2404542 * x - 1.5371385 * y - .4985314 * z), d3_xyz_rgb(-.969266 * x + 1.8760108 * y + .041556 * z), d3_xyz_rgb(.0556434 * x - .2040259 * y + 1.0572252 * z));
  }
  function d3_lab_hcl(l, a, b) {
  return d3_hcl(Math.atan2(b, a) / Math.PI * 180, Math.sqrt(a * a + b * b), l);
  }
  function d3_lab_xyz(x) {
  return x > .206893034 ? x * x * x : (x - 4 / 29) / 7.787037;
  }
  function d3_xyz_lab(x) {
  return x > .008856 ? Math.pow(x, 1 / 3) : 7.787037 * x + 4 / 29;
  }
  function d3_xyz_rgb(r) {
  return Math.round(255 * (r <= .00304 ? 12.92 * r : 1.055 * Math.pow(r, 1 / 2.4) - .055));
  }
  function d3_selection(groups) {
  d3_arraySubclass(groups, d3_selectionPrototype);
  return groups;
  }
  function d3_selection_selector(selector) {
  return function() {
  return d3_select(selector, this);
  };
  }
  function d3_selection_selectorAll(selector) {
  return function() {
  return d3_selectAll(selector, this);
  };
  }
  function d3_selection_attr(name, value) {
  function attrNull() {
  this.removeAttribute(name);
  }
  function attrNullNS() {
  this.removeAttributeNS(name.space, name.local);
  }
  function attrConstant() {
  this.setAttribute(name, value);
  }
  function attrConstantNS() {
  this.setAttributeNS(name.space, name.local, value);
  }
  function attrFunction() {
  var x = value.apply(this, arguments);
  if (x == null) this.removeAttribute(name); else this.setAttribute(name, x);
  }
  function attrFunctionNS() {
  var x = value.apply(this, arguments);
  if (x == null) this.removeAttributeNS(name.space, name.local); else this.setAttributeNS(name.space, name.local, x);
  }
  name = d3.ns.qualify(name);
  return value == null ? name.local ? attrNullNS : attrNull : typeof value === "function" ? name.local ? attrFunctionNS : attrFunction : name.local ? attrConstantNS : attrConstant;
  }
  function d3_selection_classedRe(name) {
  return new RegExp("(?:^|\\s+)" + d3.requote(name) + "(?:\\s+|$)", "g");
  }
  function d3_selection_classed(name, value) {
  function classedConstant() {
  var i = -1;
  while (++i < n) name[i](this, value);
  }
  function classedFunction() {
  var i = -1, x = value.apply(this, arguments);
  while (++i < n) name[i](this, x);
  }
  name = name.trim().split(/\s+/).map(d3_selection_classedName);
  var n = name.length;
  return typeof value === "function" ? classedFunction : classedConstant;
  }
  function d3_selection_classedName(name) {
  var re = d3_selection_classedRe(name);
  return function(node, value) {
  if (c = node.classList) return value ? c.add(name) : c.remove(name);
  var c = node.className, cb = c.baseVal != null, cv = cb ? c.baseVal : c;
  if (value) {
  re.lastIndex = 0;
  if (!re.test(cv)) {
  cv = d3_collapse(cv + " " + name);
  if (cb) c.baseVal = cv; else node.className = cv;
  }
  } else if (cv) {
  cv = d3_collapse(cv.replace(re, " "));
  if (cb) c.baseVal = cv; else node.className = cv;
  }
  };
  }
  function d3_selection_style(name, value, priority) {
  function styleNull() {
  this.style.removeProperty(name);
  }
  function styleConstant() {
  this.style.setProperty(name, value, priority);
  }
  function styleFunction() {
  var x = value.apply(this, arguments);
  if (x == null) this.style.removeProperty(name); else this.style.setProperty(name, x, priority);
  }
  return value == null ? styleNull : typeof value === "function" ? styleFunction : styleConstant;
  }
  function d3_selection_property(name, value) {
  function propertyNull() {
  delete this[name];
  }
  function propertyConstant() {
  this[name] = value;
  }
  function propertyFunction() {
  var x = value.apply(this, arguments);
  if (x == null) delete this[name]; else this[name] = x;
  }
  return value == null ? propertyNull : typeof value === "function" ? propertyFunction : propertyConstant;
  }
  function d3_selection_dataNode(data) {
  return {
  __data__: data
  };
  }
  function d3_selection_filter(selector) {
  return function() {
  return d3_selectMatches(this, selector);
  };
  }
  function d3_selection_sortComparator(comparator) {
  if (!arguments.length) comparator = d3.ascending;
  return function(a, b) {
  return comparator(a && a.__data__, b && b.__data__);
  };
  }
  function d3_selection_on(type, listener, capture) {
  function onRemove() {
  var wrapper = this[name];
  if (wrapper) {
  this.removeEventListener(type, wrapper, wrapper.$);
  delete this[name];
  }
  }
  function onAdd() {
  function wrapper(e) {
  var o = d3.event;
  d3.event = e;
  args[0] = node.__data__;
  try {
  listener.apply(node, args);
  } finally {
  d3.event = o;
  }
  }
  var node = this, args = arguments;
  onRemove.call(this);
  this.addEventListener(type, this[name] = wrapper, wrapper.$ = capture);
  wrapper._ = listener;
  }
  var name = "__on" + type, i = type.indexOf(".");
  if (i > 0) type = type.substring(0, i);
  return listener ? onAdd : onRemove;
  }
  function d3_selection_each(groups, callback) {
  for (var j = 0, m = groups.length; j < m; j++) {
  for (var group = groups[j], i = 0, n = group.length, node; i < n; i++) {
  if (node = group[i]) callback(node, i, j);
  }
  }
  return groups;
  }
  function d3_selection_enter(selection) {
  d3_arraySubclass(selection, d3_selection_enterPrototype);
  return selection;
  }
  function d3_transition(groups, id, time) {
  d3_arraySubclass(groups, d3_transitionPrototype);
  var tweens = new d3_Map, event = d3.dispatch("start", "end"), ease = d3_transitionEase;
  groups.id = id;
  groups.time = time;
  groups.tween = function(name, tween) {
  if (arguments.length < 2) return tweens.get(name);
  if (tween == null) tweens.remove(name); else tweens.set(name, tween);
  return groups;
  };
  groups.ease = function(value) {
  if (!arguments.length) return ease;
  ease = typeof value === "function" ? value : d3.ease.apply(d3, arguments);
  return groups;
  };
  groups.each = function(type, listener) {
  if (arguments.length < 2) return d3_transition_each.call(groups, type);
  event.on(type, listener);
  return groups;
  };
  d3.timer(function(elapsed) {
  return d3_selection_each(groups, function(node, i, j) {
  function start(elapsed) {
  if (lock.active > id) return stop();
  lock.active = id;
  tweens.forEach(function(key, value) {
  if (value = value.call(node, d, i)) {
  tweened.push(value);
  }
  });
  event.start.call(node, d, i);
  if (!tick(elapsed)) d3.timer(tick, 0, time);
  return 1;
  }
  function tick(elapsed) {
  if (lock.active !== id) return stop();
  var t = (elapsed - delay) / duration, e = ease(t), n = tweened.length;
  while (n > 0) {
  tweened[--n].call(node, e);
  }
  if (t >= 1) {
  stop();
  d3_transitionId = id;
  event.end.call(node, d, i);
  d3_transitionId = 0;
  return 1;
  }
  }
  function stop() {
  if (!--lock.count) delete node.__transition__;
  return 1;
  }
  var tweened = [], delay = node.delay, duration = node.duration, lock = (node = node.node).__transition__ || (node.__transition__ = {
  active: 0,
  count: 0
  }), d = node.__data__;
  ++lock.count;
  delay <= elapsed ? start(elapsed) : d3.timer(start, delay, time);
  });
  }, 0, time);
  return groups;
  }
  function d3_transition_each(callback) {
  var id = d3_transitionId, ease = d3_transitionEase, delay = d3_transitionDelay, duration = d3_transitionDuration;
  d3_transitionId = this.id;
  d3_transitionEase = this.ease();
  d3_selection_each(this, function(node, i, j) {
  d3_transitionDelay = node.delay;
  d3_transitionDuration = node.duration;
  callback.call(node = node.node, node.__data__, i, j);
  });
  d3_transitionId = id;
  d3_transitionEase = ease;
  d3_transitionDelay = delay;
  d3_transitionDuration = duration;
  return this;
  }
  function d3_tweenNull(d, i, a) {
  return a != "" && d3_tweenRemove;
  }
  function d3_tweenByName(b, name) {
  return d3.tween(b, d3_interpolateByName(name));
  }
  function d3_timer_step() {
  var elapsed, now = Date.now(), t1 = d3_timer_queue;
  while (t1) {
  elapsed = now - t1.then;
  if (elapsed >= t1.delay) t1.flush = t1.callback(elapsed);
  t1 = t1.next;
  }
  var delay = d3_timer_flush() - now;
  if (delay > 24) {
  if (isFinite(delay)) {
  clearTimeout(d3_timer_timeout);
  d3_timer_timeout = setTimeout(d3_timer_step, delay);
  }
  d3_timer_interval = 0;
  } else {
  d3_timer_interval = 1;
  d3_timer_frame(d3_timer_step);
  }
  }
  function d3_timer_flush() {
  var t0 = null, t1 = d3_timer_queue, then = Infinity;
  while (t1) {
  if (t1.flush) {
  delete d3_timer_byId[t1.callback.id];
  t1 = t0 ? t0.next = t1.next : d3_timer_queue = t1.next;
  } else {
  then = Math.min(then, t1.then + t1.delay);
  t1 = (t0 = t1).next;
  }
  }
  return then;
  }
  function d3_mousePoint(container, e) {
  var svg = container.ownerSVGElement || container;
  if (svg.createSVGPoint) {
  var point = svg.createSVGPoint();
  if (d3_mouse_bug44083 < 0 && (window.scrollX || window.scrollY)) {
  svg = d3.select(document.body).append("svg").style("position", "absolute").style("top", 0).style("left", 0);
  var ctm = svg[0][0].getScreenCTM();
  d3_mouse_bug44083 = !(ctm.f || ctm.e);
  svg.remove();
  }
  if (d3_mouse_bug44083) {
  point.x = e.pageX;
  point.y = e.pageY;
  } else {
  point.x = e.clientX;
  point.y = e.clientY;
  }
  point = point.matrixTransform(container.getScreenCTM().inverse());
  return [ point.x, point.y ];
  }
  var rect = container.getBoundingClientRect();
  return [ e.clientX - rect.left - container.clientLeft, e.clientY - rect.top - container.clientTop ];
  }
  function d3_noop() {}
  function d3_scaleExtent(domain) {
  var start = domain[0], stop = domain[domain.length - 1];
  return start < stop ? [ start, stop ] : [ stop, start ];
  }
  function d3_scaleRange(scale) {
  return scale.rangeExtent ? scale.rangeExtent() : d3_scaleExtent(scale.range());
  }
  function d3_scale_nice(domain, nice) {
  var i0 = 0, i1 = domain.length - 1, x0 = domain[i0], x1 = domain[i1], dx;
  if (x1 < x0) {
  dx = i0, i0 = i1, i1 = dx;
  dx = x0, x0 = x1, x1 = dx;
  }
  if (nice = nice(x1 - x0)) {
  domain[i0] = nice.floor(x0);
  domain[i1] = nice.ceil(x1);
  }
  return domain;
  }
  function d3_scale_niceDefault() {
  return Math;
  }
  function d3_scale_linear(domain, range, interpolate, clamp) {
  function rescale() {
  var linear = Math.min(domain.length, range.length) > 2 ? d3_scale_polylinear : d3_scale_bilinear, uninterpolate = clamp ? d3_uninterpolateClamp : d3_uninterpolateNumber;
  output = linear(domain, range, uninterpolate, interpolate);
  input = linear(range, domain, uninterpolate, d3.interpolate);
  return scale;
  }
  function scale(x) {
  return output(x);
  }
  var output, input;
  scale.invert = function(y) {
  return input(y);
  };
  scale.domain = function(x) {
  if (!arguments.length) return domain;
  domain = x.map(Number);
  return rescale();
  };
  scale.range = function(x) {
  if (!arguments.length) return range;
  range = x;
  return rescale();
  };
  scale.rangeRound = function(x) {
  return scale.range(x).interpolate(d3.interpolateRound);
  };
  scale.clamp = function(x) {
  if (!arguments.length) return clamp;
  clamp = x;
  return rescale();
  };
  scale.interpolate = function(x) {
  if (!arguments.length) return interpolate;
  interpolate = x;
  return rescale();
  };
  scale.ticks = function(m) {
  return d3_scale_linearTicks(domain, m);
  };
  scale.tickFormat = function(m) {
  return d3_scale_linearTickFormat(domain, m);
  };
  scale.nice = function() {
  d3_scale_nice(domain, d3_scale_linearNice);
  return rescale();
  };
  scale.copy = function() {
  return d3_scale_linear(domain, range, interpolate, clamp);
  };
  return rescale();
  }
  function d3_scale_linearRebind(scale, linear) {
  return d3.rebind(scale, linear, "range", "rangeRound", "interpolate", "clamp");
  }
  function d3_scale_linearNice(dx) {
  dx = Math.pow(10, Math.round(Math.log(dx) / Math.LN10) - 1);
  return dx && {
  floor: function(x) {
  return Math.floor(x / dx) * dx;
  },
  ceil: function(x) {
  return Math.ceil(x / dx) * dx;
  }
  };
  }
  function d3_scale_linearTickRange(domain, m) {
  var extent = d3_scaleExtent(domain), span = extent[1] - extent[0], step = Math.pow(10, Math.floor(Math.log(span / m) / Math.LN10)), err = m / span * step;
  if (err <= .15) step *= 10; else if (err <= .35) step *= 5; else if (err <= .75) step *= 2;
  extent[0] = Math.ceil(extent[0] / step) * step;
  extent[1] = Math.floor(extent[1] / step) * step + step * .5;
  extent[2] = step;
  return extent;
  }
  function d3_scale_linearTicks(domain, m) {
  return d3.range.apply(d3, d3_scale_linearTickRange(domain, m));
  }
  function d3_scale_linearTickFormat(domain, m) {
  return d3.format(",." + Math.max(0, -Math.floor(Math.log(d3_scale_linearTickRange(domain, m)[2]) / Math.LN10 + .01)) + "f");
  }
  function d3_scale_bilinear(domain, range, uninterpolate, interpolate) {
  var u = uninterpolate(domain[0], domain[1]), i = interpolate(range[0], range[1]);
  return function(x) {
  return i(u(x));
  };
  }
  function d3_scale_polylinear(domain, range, uninterpolate, interpolate) {
  var u = [], i = [], j = 0, k = Math.min(domain.length, range.length) - 1;
  if (domain[k] < domain[0]) {
  domain = domain.slice().reverse();
  range = range.slice().reverse();
  }
  while (++j <= k) {
  u.push(uninterpolate(domain[j - 1], domain[j]));
  i.push(interpolate(range[j - 1], range[j]));
  }
  return function(x) {
  var j = d3.bisect(domain, x, 1, k) - 1;
  return i[j](u[j](x));
  };
  }
  function d3_scale_log(linear, log) {
  function scale(x) {
  return linear(log(x));
  }
  var pow = log.pow;
  scale.invert = function(x) {
  return pow(linear.invert(x));
  };
  scale.domain = function(x) {
  if (!arguments.length) return linear.domain().map(pow);
  log = x[0] < 0 ? d3_scale_logn : d3_scale_logp;
  pow = log.pow;
  linear.domain(x.map(log));
  return scale;
  };
  scale.nice = function() {
  linear.domain(d3_scale_nice(linear.domain(), d3_scale_niceDefault));
  return scale;
  };
  scale.ticks = function() {
  var extent = d3_scaleExtent(linear.domain()), ticks = [];
  if (extent.every(isFinite)) {
  var i = Math.floor(extent[0]), j = Math.ceil(extent[1]), u = pow(extent[0]), v = pow(extent[1]);
  if (log === d3_scale_logn) {
  ticks.push(pow(i));
  for (; i++ < j; ) for (var k = 9; k > 0; k--) ticks.push(pow(i) * k);
  } else {
  for (; i < j; i++) for (var k = 1; k < 10; k++) ticks.push(pow(i) * k);
  ticks.push(pow(i));
  }
  for (i = 0; ticks[i] < u; i++) {}
  for (j = ticks.length; ticks[j - 1] > v; j--) {}
  ticks = ticks.slice(i, j);
  }
  return ticks;
  };
  scale.tickFormat = function(n, format) {
  if (arguments.length < 2) format = d3_scale_logFormat;
  if (arguments.length < 1) return format;
  var k = Math.max(.1, n / scale.ticks().length), f = log === d3_scale_logn ? (e = -1e-12, Math.floor) : (e = 1e-12, Math.ceil), e;
  return function(d) {
  return d / pow(f(log(d) + e)) <= k ? format(d) : "";
  };
  };
  scale.copy = function() {
  return d3_scale_log(linear.copy(), log);
  };
  return d3_scale_linearRebind(scale, linear);
  }
  function d3_scale_logp(x) {
  return Math.log(x < 0 ? 0 : x) / Math.LN10;
  }
  function d3_scale_logn(x) {
  return -Math.log(x > 0 ? 0 : -x) / Math.LN10;
  }
  function d3_scale_pow(linear, exponent) {
  function scale(x) {
  return linear(powp(x));
  }
  var powp = d3_scale_powPow(exponent), powb = d3_scale_powPow(1 / exponent);
  scale.invert = function(x) {
  return powb(linear.invert(x));
  };
  scale.domain = function(x) {
  if (!arguments.length) return linear.domain().map(powb);
  linear.domain(x.map(powp));
  return scale;
  };
  scale.ticks = function(m) {
  return d3_scale_linearTicks(scale.domain(), m);
  };
  scale.tickFormat = function(m) {
  return d3_scale_linearTickFormat(scale.domain(), m);
  };
  scale.nice = function() {
  return scale.domain(d3_scale_nice(scale.domain(), d3_scale_linearNice));
  };
  scale.exponent = function(x) {
  if (!arguments.length) return exponent;
  var domain = scale.domain();
  powp = d3_scale_powPow(exponent = x);
  powb = d3_scale_powPow(1 / exponent);
  return scale.domain(domain);
  };
  scale.copy = function() {
  return d3_scale_pow(linear.copy(), exponent);
  };
  return d3_scale_linearRebind(scale, linear);
  }
  function d3_scale_powPow(e) {
  return function(x) {
  return x < 0 ? -Math.pow(-x, e) : Math.pow(x, e);
  };
  }
  function d3_scale_ordinal(domain, ranger) {
  function scale(x) {
  return range[((index.get(x) || index.set(x, domain.push(x))) - 1) % range.length];
  }
  function steps(start, step) {
  return d3.range(domain.length).map(function(i) {
  return start + step * i;
  });
  }
  var index, range, rangeBand;
  scale.domain = function(x) {
  if (!arguments.length) return domain;
  domain = [];
  index = new d3_Map;
  var i = -1, n = x.length, xi;
  while (++i < n) if (!index.has(xi = x[i])) index.set(xi, domain.push(xi));
  return scale[ranger.t].apply(scale, ranger.a);
  };
  scale.range = function(x) {
  if (!arguments.length) return range;
  range = x;
  rangeBand = 0;
  ranger = {
  t: "range",
  a: arguments
  };
  return scale;
  };
  scale.rangePoints = function(x, padding) {
  if (arguments.length < 2) padding = 0;
  var start = x[0], stop = x[1], step = (stop - start) / (Math.max(1, domain.length - 1) + padding);
  range = steps(domain.length < 2 ? (start + stop) / 2 : start + step * padding / 2, step);
  rangeBand = 0;
  ranger = {
  t: "rangePoints",
  a: arguments
  };
  return scale;
  };
  scale.rangeBands = function(x, padding, outerPadding) {
  if (arguments.length < 2) padding = 0;
  if (arguments.length < 3) outerPadding = padding;
  var reverse = x[1] < x[0], start = x[reverse - 0], stop = x[1 - reverse], step = (stop - start) / (domain.length - padding + 2 * outerPadding);
  range = steps(start + step * outerPadding, step);
  if (reverse) range.reverse();
  rangeBand = step * (1 - padding);
  ranger = {
  t: "rangeBands",
  a: arguments
  };
  return scale;
  };
  scale.rangeRoundBands = function(x, padding, outerPadding) {
  if (arguments.length < 2) padding = 0;
  if (arguments.length < 3) outerPadding = padding;
  var reverse = x[1] < x[0], start = x[reverse - 0], stop = x[1 - reverse], step = Math.floor((stop - start) / (domain.length - padding + 2 * outerPadding)), error = stop - start - (domain.length - padding) * step;
  range = steps(start + Math.round(error / 2), step);
  if (reverse) range.reverse();
  rangeBand = Math.round(step * (1 - padding));
  ranger = {
  t: "rangeRoundBands",
  a: arguments
  };
  return scale;
  };
  scale.rangeBand = function() {
  return rangeBand;
  };
  scale.rangeExtent = function() {
  return d3_scaleExtent(ranger.a[0]);
  };
  scale.copy = function() {
  return d3_scale_ordinal(domain, ranger);
  };
  return scale.domain(domain);
  }
  function d3_scale_quantile(domain, range) {
  function rescale() {
  var k = 0, n = domain.length, q = range.length;
  thresholds = [];
  while (++k < q) thresholds[k - 1] = d3.quantile(domain, k / q);
  return scale;
  }
  function scale(x) {
  if (isNaN(x = +x)) return NaN;
  return range[d3.bisect(thresholds, x)];
  }
  var thresholds;
  scale.domain = function(x) {
  if (!arguments.length) return domain;
  domain = x.filter(function(d) {
  return !isNaN(d);
  }).sort(d3.ascending);
  return rescale();
  };
  scale.range = function(x) {
  if (!arguments.length) return range;
  range = x;
  return rescale();
  };
  scale.quantiles = function() {
  return thresholds;
  };
  scale.copy = function() {
  return d3_scale_quantile(domain, range);
  };
  return rescale();
  }
  function d3_scale_quantize(x0, x1, range) {
  function scale(x) {
  return range[Math.max(0, Math.min(i, Math.floor(kx * (x - x0))))];
  }
  function rescale() {
  kx = range.length / (x1 - x0);
  i = range.length - 1;
  return scale;
  }
  var kx, i;
  scale.domain = function(x) {
  if (!arguments.length) return [ x0, x1 ];
  x0 = +x[0];
  x1 = +x[x.length - 1];
  return rescale();
  };
  scale.range = function(x) {
  if (!arguments.length) return range;
  range = x;
  return rescale();
  };
  scale.copy = function() {
  return d3_scale_quantize(x0, x1, range);
  };
  return rescale();
  }
  function d3_scale_threshold(domain, range) {
  function scale(x) {
  return range[d3.bisect(domain, x)];
  }
  scale.domain = function(_) {
  if (!arguments.length) return domain;
  domain = _;
  return scale;
  };
  scale.range = function(_) {
  if (!arguments.length) return range;
  range = _;
  return scale;
  };
  scale.copy = function() {
  return d3_scale_threshold(domain, range);
  };
  return scale;
  }
  function d3_scale_identity(domain) {
  function identity(x) {
  return +x;
  }
  identity.invert = identity;
  identity.domain = identity.range = function(x) {
  if (!arguments.length) return domain;
  domain = x.map(identity);
  return identity;
  };
  identity.ticks = function(m) {
  return d3_scale_linearTicks(domain, m);
  };
  identity.tickFormat = function(m) {
  return d3_scale_linearTickFormat(domain, m);
  };
  identity.copy = function() {
  return d3_scale_identity(domain);
  };
  return identity;
  }
  function d3_svg_arcInnerRadius(d) {
  return d.innerRadius;
  }
  function d3_svg_arcOuterRadius(d) {
  return d.outerRadius;
  }
  function d3_svg_arcStartAngle(d) {
  return d.startAngle;
  }
  function d3_svg_arcEndAngle(d) {
  return d.endAngle;
  }
  function d3_svg_line(projection) {
  function line(data) {
  function segment() {
  segments.push("M", interpolate(projection(points), tension));
  }
  var segments = [], points = [], i = -1, n = data.length, d, fx = d3_functor(x), fy = d3_functor(y);
  while (++i < n) {
  if (defined.call(this, d = data[i], i)) {
  points.push([ +fx.call(this, d, i), +fy.call(this, d, i) ]);
  } else if (points.length) {
  segment();
  points = [];
  }
  }
  if (points.length) segment();
  return segments.length ? segments.join("") : null;
  }
  var x = d3_svg_lineX, y = d3_svg_lineY, defined = d3_true, interpolate = d3_svg_lineLinear, interpolateKey = interpolate.key, tension = .7;
  line.x = function(_) {
  if (!arguments.length) return x;
  x = _;
  return line;
  };
  line.y = function(_) {
  if (!arguments.length) return y;
  y = _;
  return line;
  };
  line.defined = function(_) {
  if (!arguments.length) return defined;
  defined = _;
  return line;
  };
  line.interpolate = function(_) {
  if (!arguments.length) return interpolateKey;
  if (typeof _ === "function") interpolateKey = interpolate = _; else interpolateKey = (interpolate = d3_svg_lineInterpolators.get(_) || d3_svg_lineLinear).key;
  return line;
  };
  line.tension = function(_) {
  if (!arguments.length) return tension;
  tension = _;
  return line;
  };
  return line;
  }
  function d3_svg_lineX(d) {
  return d[0];
  }
  function d3_svg_lineY(d) {
  return d[1];
  }
  function d3_svg_lineLinear(points) {
  return points.join("L");
  }
  function d3_svg_lineLinearClosed(points) {
  return d3_svg_lineLinear(points) + "Z";
  }
  function d3_svg_lineStepBefore(points) {
  var i = 0, n = points.length, p = points[0], path = [ p[0], ",", p[1] ];
  while (++i < n) path.push("V", (p = points[i])[1], "H", p[0]);
  return path.join("");
  }
  function d3_svg_lineStepAfter(points) {
  var i = 0, n = points.length, p = points[0], path = [ p[0], ",", p[1] ];
  while (++i < n) path.push("H", (p = points[i])[0], "V", p[1]);
  return path.join("");
  }
  function d3_svg_lineCardinalOpen(points, tension) {
  return points.length < 4 ? d3_svg_lineLinear(points) : points[1] + d3_svg_lineHermite(points.slice(1, points.length - 1), d3_svg_lineCardinalTangents(points, tension));
  }
  function d3_svg_lineCardinalClosed(points, tension) {
  return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite((points.push(points[0]), points), d3_svg_lineCardinalTangents([ points[points.length - 2] ].concat(points, [ points[1] ]), tension));
  }
  function d3_svg_lineCardinal(points, tension, closed) {
  return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite(points, d3_svg_lineCardinalTangents(points, tension));
  }
  function d3_svg_lineHermite(points, tangents) {
  if (tangents.length < 1 || points.length != tangents.length && points.length != tangents.length + 2) {
  return d3_svg_lineLinear(points);
  }
  var quad = points.length != tangents.length, path = "", p0 = points[0], p = points[1], t0 = tangents[0], t = t0, pi = 1;
  if (quad) {
  path += "Q" + (p[0] - t0[0] * 2 / 3) + "," + (p[1] - t0[1] * 2 / 3) + "," + p[0] + "," + p[1];
  p0 = points[1];
  pi = 2;
  }
  if (tangents.length > 1) {
  t = tangents[1];
  p = points[pi];
  pi++;
  path += "C" + (p0[0] + t0[0]) + "," + (p0[1] + t0[1]) + "," + (p[0] - t[0]) + "," + (p[1] - t[1]) + "," + p[0] + "," + p[1];
  for (var i = 2; i < tangents.length; i++, pi++) {
  p = points[pi];
  t = tangents[i];
  path += "S" + (p[0] - t[0]) + "," + (p[1] - t[1]) + "," + p[0] + "," + p[1];
  }
  }
  if (quad) {
  var lp = points[pi];
  path += "Q" + (p[0] + t[0] * 2 / 3) + "," + (p[1] + t[1] * 2 / 3) + "," + lp[0] + "," + lp[1];
  }
  return path;
  }
  function d3_svg_lineCardinalTangents(points, tension) {
  var tangents = [], a = (1 - tension) / 2, p0, p1 = points[0], p2 = points[1], i = 1, n = points.length;
  while (++i < n) {
  p0 = p1;
  p1 = p2;
  p2 = points[i];
  tangents.push([ a * (p2[0] - p0[0]), a * (p2[1] - p0[1]) ]);
  }
  return tangents;
  }
  function d3_svg_lineBasis(points) {
  if (points.length < 3) return d3_svg_lineLinear(points);
  var i = 1, n = points.length, pi = points[0], x0 = pi[0], y0 = pi[1], px = [ x0, x0, x0, (pi = points[1])[0] ], py = [ y0, y0, y0, pi[1] ], path = [ x0, ",", y0 ];
  d3_svg_lineBasisBezier(path, px, py);
  while (++i < n) {
  pi = points[i];
  px.shift();
  px.push(pi[0]);
  py.shift();
  py.push(pi[1]);
  d3_svg_lineBasisBezier(path, px, py);
  }
  i = -1;
  while (++i < 2) {
  px.shift();
  px.push(pi[0]);
  py.shift();
  py.push(pi[1]);
  d3_svg_lineBasisBezier(path, px, py);
  }
  return path.join("");
  }
  function d3_svg_lineBasisOpen(points) {
  if (points.length < 4) return d3_svg_lineLinear(points);
  var path = [], i = -1, n = points.length, pi, px = [ 0 ], py = [ 0 ];
  while (++i < 3) {
  pi = points[i];
  px.push(pi[0]);
  py.push(pi[1]);
  }
  path.push(d3_svg_lineDot4(d3_svg_lineBasisBezier3, px) + "," + d3_svg_lineDot4(d3_svg_lineBasisBezier3, py));
  --i;
  while (++i < n) {
  pi = points[i];
  px.shift();
  px.push(pi[0]);
  py.shift();
  py.push(pi[1]);
  d3_svg_lineBasisBezier(path, px, py);
  }
  return path.join("");
  }
  function d3_svg_lineBasisClosed(points) {
  var path, i = -1, n = points.length, m = n + 4, pi, px = [], py = [];
  while (++i < 4) {
  pi = points[i % n];
  px.push(pi[0]);
  py.push(pi[1]);
  }
  path = [ d3_svg_lineDot4(d3_svg_lineBasisBezier3, px), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, py) ];
  --i;
  while (++i < m) {
  pi = points[i % n];
  px.shift();
  px.push(pi[0]);
  py.shift();
  py.push(pi[1]);
  d3_svg_lineBasisBezier(path, px, py);
  }
  return path.join("");
  }
  function d3_svg_lineBundle(points, tension) {
  var n = points.length - 1;
  if (n) {
  var x0 = points[0][0], y0 = points[0][1], dx = points[n][0] - x0, dy = points[n][1] - y0, i = -1, p, t;
  while (++i <= n) {
  p = points[i];
  t = i / n;
  p[0] = tension * p[0] + (1 - tension) * (x0 + t * dx);
  p[1] = tension * p[1] + (1 - tension) * (y0 + t * dy);
  }
  }
  return d3_svg_lineBasis(points);
  }
  function d3_svg_lineDot4(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3];
  }
  function d3_svg_lineBasisBezier(path, x, y) {
  path.push("C", d3_svg_lineDot4(d3_svg_lineBasisBezier1, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier1, y), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier2, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier2, y), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, x), ",", d3_svg_lineDot4(d3_svg_lineBasisBezier3, y));
  }
  function d3_svg_lineSlope(p0, p1) {
  return (p1[1] - p0[1]) / (p1[0] - p0[0]);
  }
  function d3_svg_lineFiniteDifferences(points) {
  var i = 0, j = points.length - 1, m = [], p0 = points[0], p1 = points[1], d = m[0] = d3_svg_lineSlope(p0, p1);
  while (++i < j) {
  m[i] = (d + (d = d3_svg_lineSlope(p0 = p1, p1 = points[i + 1]))) / 2;
  }
  m[i] = d;
  return m;
  }
  function d3_svg_lineMonotoneTangents(points) {
  var tangents = [], d, a, b, s, m = d3_svg_lineFiniteDifferences(points), i = -1, j = points.length - 1;
  while (++i < j) {
  d = d3_svg_lineSlope(points[i], points[i + 1]);
  if (Math.abs(d) < 1e-6) {
  m[i] = m[i + 1] = 0;
  } else {
  a = m[i] / d;
  b = m[i + 1] / d;
  s = a * a + b * b;
  if (s > 9) {
  s = d * 3 / Math.sqrt(s);
  m[i] = s * a;
  m[i + 1] = s * b;
  }
  }
  }
  i = -1;
  while (++i <= j) {
  s = (points[Math.min(j, i + 1)][0] - points[Math.max(0, i - 1)][0]) / (6 * (1 + m[i] * m[i]));
  tangents.push([ s || 0, m[i] * s || 0 ]);
  }
  return tangents;
  }
  function d3_svg_lineMonotone(points) {
  return points.length < 3 ? d3_svg_lineLinear(points) : points[0] + d3_svg_lineHermite(points, d3_svg_lineMonotoneTangents(points));
  }
  function d3_svg_lineRadial(points) {
  var point, i = -1, n = points.length, r, a;
  while (++i < n) {
  point = points[i];
  r = point[0];
  a = point[1] + d3_svg_arcOffset;
  point[0] = r * Math.cos(a);
  point[1] = r * Math.sin(a);
  }
  return points;
  }
  function d3_svg_area(projection) {
  function area(data) {
  function segment() {
  segments.push("M", interpolate(projection(points1), tension), L, interpolateReverse(projection(points0.reverse()), tension), "Z");
  }
  var segments = [], points0 = [], points1 = [], i = -1, n = data.length, d, fx0 = d3_functor(x0), fy0 = d3_functor(y0), fx1 = x0 === x1 ? function() {
  return x;
  } : d3_functor(x1), fy1 = y0 === y1 ? function() {
  return y;
  } : d3_functor(y1), x, y;
  while (++i < n) {
  if (defined.call(this, d = data[i], i)) {
  points0.push([ x = +fx0.call(this, d, i), y = +fy0.call(this, d, i) ]);
  points1.push([ +fx1.call(this, d, i), +fy1.call(this, d, i) ]);
  } else if (points0.length) {
  segment();
  points0 = [];
  points1 = [];
  }
  }
  if (points0.length) segment();
  return segments.length ? segments.join("") : null;
  }
  var x0 = d3_svg_lineX, x1 = d3_svg_lineX, y0 = 0, y1 = d3_svg_lineY, defined = d3_true, interpolate = d3_svg_lineLinear, interpolateKey = interpolate.key, interpolateReverse = interpolate, L = "L", tension = .7;
  area.x = function(_) {
  if (!arguments.length) return x1;
  x0 = x1 = _;
  return area;
  };
  area.x0 = function(_) {
  if (!arguments.length) return x0;
  x0 = _;
  return area;
  };
  area.x1 = function(_) {
  if (!arguments.length) return x1;
  x1 = _;
  return area;
  };
  area.y = function(_) {
  if (!arguments.length) return y1;
  y0 = y1 = _;
  return area;
  };
  area.y0 = function(_) {
  if (!arguments.length) return y0;
  y0 = _;
  return area;
  };
  area.y1 = function(_) {
  if (!arguments.length) return y1;
  y1 = _;
  return area;
  };
  area.defined = function(_) {
  if (!arguments.length) return defined;
  defined = _;
  return area;
  };
  area.interpolate = function(_) {
  if (!arguments.length) return interpolateKey;
  if (typeof _ === "function") interpolateKey = interpolate = _; else interpolateKey = (interpolate = d3_svg_lineInterpolators.get(_) || d3_svg_lineLinear).key;
  interpolateReverse = interpolate.reverse || interpolate;
  L = interpolate.closed ? "M" : "L";
  return area;
  };
  area.tension = function(_) {
  if (!arguments.length) return tension;
  tension = _;
  return area;
  };
  return area;
  }
  function d3_svg_chordSource(d) {
  return d.source;
  }
  function d3_svg_chordTarget(d) {
  return d.target;
  }
  function d3_svg_chordRadius(d) {
  return d.radius;
  }
  function d3_svg_chordStartAngle(d) {
  return d.startAngle;
  }
  function d3_svg_chordEndAngle(d) {
  return d.endAngle;
  }
  function d3_svg_diagonalProjection(d) {
  return [ d.x, d.y ];
  }
  function d3_svg_diagonalRadialProjection(projection) {
  return function() {
  var d = projection.apply(this, arguments), r = d[0], a = d[1] + d3_svg_arcOffset;
  return [ r * Math.cos(a), r * Math.sin(a) ];
  };
  }
  function d3_svg_symbolSize() {
  return 64;
  }
  function d3_svg_symbolType() {
  return "circle";
  }
  function d3_svg_symbolCircle(size) {
  var r = Math.sqrt(size / Math.PI);
  return "M0," + r + "A" + r + "," + r + " 0 1,1 0," + -r + "A" + r + "," + r + " 0 1,1 0," + r + "Z";
  }
  function d3_svg_axisX(selection, x) {
  selection.attr("transform", function(d) {
  return "translate(" + x(d) + ",0)";
  });
  }
  function d3_svg_axisY(selection, y) {
  selection.attr("transform", function(d) {
  return "translate(0," + y(d) + ")";
  });
  }
  function d3_svg_axisSubdivide(scale, ticks, m) {
  var subticks = [];
  if (m && ticks.length > 1) {
  var extent = d3_scaleExtent(scale.domain()), i = -1, n = ticks.length, d = (ticks[1] - ticks[0]) / ++m, j, v;
  while (++i < n) {
  for (j = m; --j > 0; ) {
  if ((v = +ticks[i] - j * d) >= extent[0]) {
  subticks.push(v);
  }
  }
  }
  for (--i, j = 0; ++j < m && (v = +ticks[i] + j * d) < extent[1]; ) {
  subticks.push(v);
  }
  }
  return subticks;
  }
  function d3_behavior_zoomDelta() {
  if (!d3_behavior_zoomDiv) {
  d3_behavior_zoomDiv = d3.select("body").append("div").style("visibility", "hidden").style("top", 0).style("height", 0).style("width", 0).style("overflow-y", "scroll").append("div").style("height", "2000px").node().parentNode;
  }
  var e = d3.event, delta;
  try {
  d3_behavior_zoomDiv.scrollTop = 1e3;
  d3_behavior_zoomDiv.dispatchEvent(e);
  delta = 1e3 - d3_behavior_zoomDiv.scrollTop;
  } catch (error) {
  delta = e.wheelDelta || -e.detail * 5;
  }
  return delta;
  }
  function d3_layout_bundlePath(link) {
  var start = link.source, end = link.target, lca = d3_layout_bundleLeastCommonAncestor(start, end), points = [ start ];
  while (start !== lca) {
  start = start.parent;
  points.push(start);
  }
  var k = points.length;
  while (end !== lca) {
  points.splice(k, 0, end);
  end = end.parent;
  }
  return points;
  }
  function d3_layout_bundleAncestors(node) {
  var ancestors = [], parent = node.parent;
  while (parent != null) {
  ancestors.push(node);
  node = parent;
  parent = parent.parent;
  }
  ancestors.push(node);
  return ancestors;
  }
  function d3_layout_bundleLeastCommonAncestor(a, b) {
  if (a === b) return a;
  var aNodes = d3_layout_bundleAncestors(a), bNodes = d3_layout_bundleAncestors(b), aNode = aNodes.pop(), bNode = bNodes.pop(), sharedNode = null;
  while (aNode === bNode) {
  sharedNode = aNode;
  aNode = aNodes.pop();
  bNode = bNodes.pop();
  }
  return sharedNode;
  }
  function d3_layout_forceDragstart(d) {
  d.fixed |= 2;
  }
  function d3_layout_forceDragend(d) {
  d.fixed &= 1;
  }
  function d3_layout_forceMouseover(d) {
  d.fixed |= 4;
  }
  function d3_layout_forceMouseout(d) {
  d.fixed &= 3;
  }
  function d3_layout_forceAccumulate(quad, alpha, charges) {
  var cx = 0, cy = 0;
  quad.charge = 0;
  if (!quad.leaf) {
  var nodes = quad.nodes, n = nodes.length, i = -1, c;
  while (++i < n) {
  c = nodes[i];
  if (c == null) continue;
  d3_layout_forceAccumulate(c, alpha, charges);
  quad.charge += c.charge;
  cx += c.charge * c.cx;
  cy += c.charge * c.cy;
  }
  }
  if (quad.point) {
  if (!quad.leaf) {
  quad.point.x += Math.random() - .5;
  quad.point.y += Math.random() - .5;
  }
  var k = alpha * charges[quad.point.index];
  quad.charge += quad.pointCharge = k;
  cx += k * quad.point.x;
  cy += k * quad.point.y;
  }
  quad.cx = cx / quad.charge;
  quad.cy = cy / quad.charge;
  }
  function d3_layout_forceLinkDistance(link) {
  return 20;
  }
  function d3_layout_forceLinkStrength(link) {
  return 1;
  }
  function d3_layout_stackX(d) {
  return d.x;
  }
  function d3_layout_stackY(d) {
  return d.y;
  }
  function d3_layout_stackOut(d, y0, y) {
  d.y0 = y0;
  d.y = y;
  }
  function d3_layout_stackOrderDefault(data) {
  return d3.range(data.length);
  }
  function d3_layout_stackOffsetZero(data) {
  var j = -1, m = data[0].length, y0 = [];
  while (++j < m) y0[j] = 0;
  return y0;
  }
  function d3_layout_stackMaxIndex(array) {
  var i = 1, j = 0, v = array[0][1], k, n = array.length;
  for (; i < n; ++i) {
  if ((k = array[i][1]) > v) {
  j = i;
  v = k;
  }
  }
  return j;
  }
  function d3_layout_stackReduceSum(d) {
  return d.reduce(d3_layout_stackSum, 0);
  }
  function d3_layout_stackSum(p, d) {
  return p + d[1];
  }
  function d3_layout_histogramBinSturges(range, values) {
  return d3_layout_histogramBinFixed(range, Math.ceil(Math.log(values.length) / Math.LN2 + 1));
  }
  function d3_layout_histogramBinFixed(range, n) {
  var x = -1, b = +range[0], m = (range[1] - b) / n, f = [];
  while (++x <= n) f[x] = m * x + b;
  return f;
  }
  function d3_layout_histogramRange(values) {
  return [ d3.min(values), d3.max(values) ];
  }
  function d3_layout_hierarchyRebind(object, hierarchy) {
  d3.rebind(object, hierarchy, "sort", "children", "value");
  object.links = d3_layout_hierarchyLinks;
  object.nodes = function(d) {
  d3_layout_hierarchyInline = true;
  return (object.nodes = object)(d);
  };
  return object;
  }
  function d3_layout_hierarchyChildren(d) {
  return d.children;
  }
  function d3_layout_hierarchyValue(d) {
  return d.value;
  }
  function d3_layout_hierarchySort(a, b) {
  return b.value - a.value;
  }
  function d3_layout_hierarchyLinks(nodes) {
  return d3.merge(nodes.map(function(parent) {
  return (parent.children || []).map(function(child) {
  return {
  source: parent,
  target: child
  };
  });
  }));
  }
  function d3_layout_packSort(a, b) {
  return a.value - b.value;
  }
  function d3_layout_packInsert(a, b) {
  var c = a._pack_next;
  a._pack_next = b;
  b._pack_prev = a;
  b._pack_next = c;
  c._pack_prev = b;
  }
  function d3_layout_packSplice(a, b) {
  a._pack_next = b;
  b._pack_prev = a;
  }
  function d3_layout_packIntersects(a, b) {
  var dx = b.x - a.x, dy = b.y - a.y, dr = a.r + b.r;
  return dr * dr - dx * dx - dy * dy > .001;
  }
  function d3_layout_packSiblings(node) {
  function bound(node) {
  xMin = Math.min(node.x - node.r, xMin);
  xMax = Math.max(node.x + node.r, xMax);
  yMin = Math.min(node.y - node.r, yMin);
  yMax = Math.max(node.y + node.r, yMax);
  }
  if (!(nodes = node.children) || !(n = nodes.length)) return;
  var nodes, xMin = Infinity, xMax = -Infinity, yMin = Infinity, yMax = -Infinity, a, b, c, i, j, k, n;
  nodes.forEach(d3_layout_packLink);
  a = nodes[0];
  a.x = -a.r;
  a.y = 0;
  bound(a);
  if (n > 1) {
  b = nodes[1];
  b.x = b.r;
  b.y = 0;
  bound(b);
  if (n > 2) {
  c = nodes[2];
  d3_layout_packPlace(a, b, c);
  bound(c);
  d3_layout_packInsert(a, c);
  a._pack_prev = c;
  d3_layout_packInsert(c, b);
  b = a._pack_next;
  for (i = 3; i < n; i++) {
  d3_layout_packPlace(a, b, c = nodes[i]);
  var isect = 0, s1 = 1, s2 = 1;
  for (j = b._pack_next; j !== b; j = j._pack_next, s1++) {
  if (d3_layout_packIntersects(j, c)) {
  isect = 1;
  break;
  }
  }
  if (isect == 1) {
  for (k = a._pack_prev; k !== j._pack_prev; k = k._pack_prev, s2++) {
  if (d3_layout_packIntersects(k, c)) {
  break;
  }
  }
  }
  if (isect) {
  if (s1 < s2 || s1 == s2 && b.r < a.r) d3_layout_packSplice(a, b = j); else d3_layout_packSplice(a = k, b);
  i--;
  } else {
  d3_layout_packInsert(a, c);
  b = c;
  bound(c);
  }
  }
  }
  }
  var cx = (xMin + xMax) / 2, cy = (yMin + yMax) / 2, cr = 0;
  for (i = 0; i < n; i++) {
  c = nodes[i];
  c.x -= cx;
  c.y -= cy;
  cr = Math.max(cr, c.r + Math.sqrt(c.x * c.x + c.y * c.y));
  }
  node.r = cr;
  nodes.forEach(d3_layout_packUnlink);
  }
  function d3_layout_packLink(node) {
  node._pack_next = node._pack_prev = node;
  }
  function d3_layout_packUnlink(node) {
  delete node._pack_next;
  delete node._pack_prev;
  }
  function d3_layout_packTransform(node, x, y, k) {
  var children = node.children;
  node.x = x += k * node.x;
  node.y = y += k * node.y;
  node.r *= k;
  if (children) {
  var i = -1, n = children.length;
  while (++i < n) d3_layout_packTransform(children[i], x, y, k);
  }
  }
  function d3_layout_packPlace(a, b, c) {
  var db = a.r + c.r, dx = b.x - a.x, dy = b.y - a.y;
  if (db && (dx || dy)) {
  var da = b.r + c.r, dc = dx * dx + dy * dy;
  da *= da;
  db *= db;
  var x = .5 + (db - da) / (2 * dc), y = Math.sqrt(Math.max(0, 2 * da * (db + dc) - (db -= dc) * db - da * da)) / (2 * dc);
  c.x = a.x + x * dx + y * dy;
  c.y = a.y + x * dy - y * dx;
  } else {
  c.x = a.x + db;
  c.y = a.y;
  }
  }
  function d3_layout_clusterY(children) {
  return 1 + d3.max(children, function(child) {
  return child.y;
  });
  }
  function d3_layout_clusterX(children) {
  return children.reduce(function(x, child) {
  return x + child.x;
  }, 0) / children.length;
  }
  function d3_layout_clusterLeft(node) {
  var children = node.children;
  return children && children.length ? d3_layout_clusterLeft(children[0]) : node;
  }
  function d3_layout_clusterRight(node) {
  var children = node.children, n;
  return children && (n = children.length) ? d3_layout_clusterRight(children[n - 1]) : node;
  }
  function d3_layout_treeSeparation(a, b) {
  return a.parent == b.parent ? 1 : 2;
  }
  function d3_layout_treeLeft(node) {
  var children = node.children;
  return children && children.length ? children[0] : node._tree.thread;
  }
  function d3_layout_treeRight(node) {
  var children = node.children, n;
  return children && (n = children.length) ? children[n - 1] : node._tree.thread;
  }
  function d3_layout_treeSearch(node, compare) {
  var children = node.children;
  if (children && (n = children.length)) {
  var child, n, i = -1;
  while (++i < n) {
  if (compare(child = d3_layout_treeSearch(children[i], compare), node) > 0) {
  node = child;
  }
  }
  }
  return node;
  }
  function d3_layout_treeRightmost(a, b) {
  return a.x - b.x;
  }
  function d3_layout_treeLeftmost(a, b) {
  return b.x - a.x;
  }
  function d3_layout_treeDeepest(a, b) {
  return a.depth - b.depth;
  }
  function d3_layout_treeVisitAfter(node, callback) {
  function visit(node, previousSibling) {
  var children = node.children;
  if (children && (n = children.length)) {
  var child, previousChild = null, i = -1, n;
  while (++i < n) {
  child = children[i];
  visit(child, previousChild);
  previousChild = child;
  }
  }
  callback(node, previousSibling);
  }
  visit(node, null);
  }
  function d3_layout_treeShift(node) {
  var shift = 0, change = 0, children = node.children, i = children.length, child;
  while (--i >= 0) {
  child = children[i]._tree;
  child.prelim += shift;
  child.mod += shift;
  shift += child.shift + (change += child.change);
  }
  }
  function d3_layout_treeMove(ancestor, node, shift) {
  ancestor = ancestor._tree;
  node = node._tree;
  var change = shift / (node.number - ancestor.number);
  ancestor.change += change;
  node.change -= change;
  node.shift += shift;
  node.prelim += shift;
  node.mod += shift;
  }
  function d3_layout_treeAncestor(vim, node, ancestor) {
  return vim._tree.ancestor.parent == node.parent ? vim._tree.ancestor : ancestor;
  }
  function d3_layout_treemapPadNull(node) {
  return {
  x: node.x,
  y: node.y,
  dx: node.dx,
  dy: node.dy
  };
  }
  function d3_layout_treemapPad(node, padding) {
  var x = node.x + padding[3], y = node.y + padding[0], dx = node.dx - padding[1] - padding[3], dy = node.dy - padding[0] - padding[2];
  if (dx < 0) {
  x += dx / 2;
  dx = 0;
  }
  if (dy < 0) {
  y += dy / 2;
  dy = 0;
  }
  return {
  x: x,
  y: y,
  dx: dx,
  dy: dy
  };
  }
  function d3_dsv(delimiter, mimeType) {
  function dsv(url, callback) {
  d3.text(url, mimeType, function(text) {
  callback(text && dsv.parse(text));
  });
  }
  function formatRow(row) {
  return row.map(formatValue).join(delimiter);
  }
  function formatValue(text) {
  return reFormat.test(text) ? '"' + text.replace(/\"/g, '""') + '"' : text;
  }
  var reParse = new RegExp("\r\n|[" + delimiter + "\r\n]", "g"), reFormat = new RegExp('["' + delimiter + "\n]"), delimiterCode = delimiter.charCodeAt(0);
  dsv.parse = function(text) {
  var header;
  return dsv.parseRows(text, function(row, i) {
  if (i) {
  var o = {}, j = -1, m = header.length;
  while (++j < m) o[header[j]] = row[j];
  return o;
  } else {
  header = row;
  return null;
  }
  });
  };
  dsv.parseRows = function(text, f) {
  function token() {
  if (reParse.lastIndex >= text.length) return EOF;
  if (eol) {
  eol = false;
  return EOL;
  }
  var j = reParse.lastIndex;
  if (text.charCodeAt(j) === 34) {
  var i = j;
  while (i++ < text.length) {
  if (text.charCodeAt(i) === 34) {
  if (text.charCodeAt(i + 1) !== 34) break;
  i++;
  }
  }
  reParse.lastIndex = i + 2;
  var c = text.charCodeAt(i + 1);
  if (c === 13) {
  eol = true;
  if (text.charCodeAt(i + 2) === 10) reParse.lastIndex++;
  } else if (c === 10) {
  eol = true;
  }
  return text.substring(j + 1, i).replace(/""/g, '"');
  }
  var m = reParse.exec(text);
  if (m) {
  eol = m[0].charCodeAt(0) !== delimiterCode;
  return text.substring(j, m.index);
  }
  reParse.lastIndex = text.length;
  return text.substring(j);
  }
  var EOL = {}, EOF = {}, rows = [], n = 0, t, eol;
  reParse.lastIndex = 0;
  while ((t = token()) !== EOF) {
  var a = [];
  while (t !== EOL && t !== EOF) {
  a.push(t);
  t = token();
  }
  if (f && !(a = f(a, n++))) continue;
  rows.push(a);
  }
  return rows;
  };
  dsv.format = function(rows) {
  return rows.map(formatRow).join("\n");
  };
  return dsv;
  }
  function d3_geo_type(types, defaultValue) {
  return function(object) {
  return object && types.hasOwnProperty(object.type) ? types[object.type](object) : defaultValue;
  };
  }
  function d3_path_circle(radius) {
  return "m0," + radius + "a" + radius + "," + radius + " 0 1,1 0," + -2 * radius + "a" + radius + "," + radius + " 0 1,1 0," + +2 * radius + "z";
  }
  function d3_geo_bounds(o, f) {
  if (d3_geo_boundsTypes.hasOwnProperty(o.type)) d3_geo_boundsTypes[o.type](o, f);
  }
  function d3_geo_boundsFeature(o, f) {
  d3_geo_bounds(o.geometry, f);
  }
  function d3_geo_boundsFeatureCollection(o, f) {
  for (var a = o.features, i = 0, n = a.length; i < n; i++) {
  d3_geo_bounds(a[i].geometry, f);
  }
  }
  function d3_geo_boundsGeometryCollection(o, f) {
  for (var a = o.geometries, i = 0, n = a.length; i < n; i++) {
  d3_geo_bounds(a[i], f);
  }
  }
  function d3_geo_boundsLineString(o, f) {
  for (var a = o.coordinates, i = 0, n = a.length; i < n; i++) {
  f.apply(null, a[i]);
  }
  }
  function d3_geo_boundsMultiLineString(o, f) {
  for (var a = o.coordinates, i = 0, n = a.length; i < n; i++) {
  for (var b = a[i], j = 0, m = b.length; j < m; j++) {
  f.apply(null, b[j]);
  }
  }
  }
  function d3_geo_boundsMultiPolygon(o, f) {
  for (var a = o.coordinates, i = 0, n = a.length; i < n; i++) {
  for (var b = a[i][0], j = 0, m = b.length; j < m; j++) {
  f.apply(null, b[j]);
  }
  }
  }
  function d3_geo_boundsPoint(o, f) {
  f.apply(null, o.coordinates);
  }
  function d3_geo_boundsPolygon(o, f) {
  for (var a = o.coordinates[0], i = 0, n = a.length; i < n; i++) {
  f.apply(null, a[i]);
  }
  }
  function d3_geo_greatArcSource(d) {
  return d.source;
  }
  function d3_geo_greatArcTarget(d) {
  return d.target;
  }
  function d3_geo_greatArcInterpolator() {
  function interpolate(t) {
  var B = Math.sin(t *= d) * k, A = Math.sin(d - t) * k, x = A * kx0 + B * kx1, y = A * ky0 + B * ky1, z = A * sy0 + B * sy1;
  return [ Math.atan2(y, x) / d3_geo_radians, Math.atan2(z, Math.sqrt(x * x + y * y)) / d3_geo_radians ];
  }
  var x0, y0, cy0, sy0, kx0, ky0, x1, y1, cy1, sy1, kx1, ky1, d, k;
  interpolate.distance = function() {
  if (d == null) k = 1 / Math.sin(d = Math.acos(Math.max(-1, Math.min(1, sy0 * sy1 + cy0 * cy1 * Math.cos(x1 - x0)))));
  return d;
  };
  interpolate.source = function(_) {
  var cx0 = Math.cos(x0 = _[0] * d3_geo_radians), sx0 = Math.sin(x0);
  cy0 = Math.cos(y0 = _[1] * d3_geo_radians);
  sy0 = Math.sin(y0);
  kx0 = cy0 * cx0;
  ky0 = cy0 * sx0;
  d = null;
  return interpolate;
  };
  interpolate.target = function(_) {
  var cx1 = Math.cos(x1 = _[0] * d3_geo_radians), sx1 = Math.sin(x1);
  cy1 = Math.cos(y1 = _[1] * d3_geo_radians);
  sy1 = Math.sin(y1);
  kx1 = cy1 * cx1;
  ky1 = cy1 * sx1;
  d = null;
  return interpolate;
  };
  return interpolate;
  }
  function d3_geo_greatArcInterpolate(a, b) {
  var i = d3_geo_greatArcInterpolator().source(a).target(b);
  i.distance();
  return i;
  }
  function d3_geom_contourStart(grid) {
  var x = 0, y = 0;
  while (true) {
  if (grid(x, y)) {
  return [ x, y ];
  }
  if (x === 0) {
  x = y + 1;
  y = 0;
  } else {
  x = x - 1;
  y = y + 1;
  }
  }
  }
  function d3_geom_hullCCW(i1, i2, i3, v) {
  var t, a, b, c, d, e, f;
  t = v[i1];
  a = t[0];
  b = t[1];
  t = v[i2];
  c = t[0];
  d = t[1];
  t = v[i3];
  e = t[0];
  f = t[1];
  return (f - b) * (c - a) - (d - b) * (e - a) > 0;
  }
  function d3_geom_polygonInside(p, a, b) {
  return (b[0] - a[0]) * (p[1] - a[1]) < (b[1] - a[1]) * (p[0] - a[0]);
  }
  function d3_geom_polygonIntersect(c, d, a, b) {
  var x1 = c[0], x2 = d[0], x3 = a[0], x4 = b[0], y1 = c[1], y2 = d[1], y3 = a[1], y4 = b[1], x13 = x1 - x3, x21 = x2 - x1, x43 = x4 - x3, y13 = y1 - y3, y21 = y2 - y1, y43 = y4 - y3, ua = (x43 * y13 - y43 * x13) / (y43 * x21 - x43 * y21);
  return [ x1 + ua * x21, y1 + ua * y21 ];
  }
  function d3_voronoi_tessellate(vertices, callback) {
  var Sites = {
  list: vertices.map(function(v, i) {
  return {
  index: i,
  x: v[0],
  y: v[1]
  };
  }).sort(function(a, b) {
  return a.y < b.y ? -1 : a.y > b.y ? 1 : a.x < b.x ? -1 : a.x > b.x ? 1 : 0;
  }),
  bottomSite: null
  };
  var EdgeList = {
  list: [],
  leftEnd: null,
  rightEnd: null,
  init: function() {
  EdgeList.leftEnd = EdgeList.createHalfEdge(null, "l");
  EdgeList.rightEnd = EdgeList.createHalfEdge(null, "l");
  EdgeList.leftEnd.r = EdgeList.rightEnd;
  EdgeList.rightEnd.l = EdgeList.leftEnd;
  EdgeList.list.unshift(EdgeList.leftEnd, EdgeList.rightEnd);
  },
  createHalfEdge: function(edge, side) {
  return {
  edge: edge,
  side: side,
  vertex: null,
  l: null,
  r: null
  };
  },
  insert: function(lb, he) {
  he.l = lb;
  he.r = lb.r;
  lb.r.l = he;
  lb.r = he;
  },
  leftBound: function(p) {
  var he = EdgeList.leftEnd;
  do {
  he = he.r;
  } while (he != EdgeList.rightEnd && Geom.rightOf(he, p));
  he = he.l;
  return he;
  },
  del: function(he) {
  he.l.r = he.r;
  he.r.l = he.l;
  he.edge = null;
  },
  right: function(he) {
  return he.r;
  },
  left: function(he) {
  return he.l;
  },
  leftRegion: function(he) {
  return he.edge == null ? Sites.bottomSite : he.edge.region[he.side];
  },
  rightRegion: function(he) {
  return he.edge == null ? Sites.bottomSite : he.edge.region[d3_voronoi_opposite[he.side]];
  }
  };
  var Geom = {
  bisect: function(s1, s2) {
  var newEdge = {
  region: {
  l: s1,
  r: s2
  },
  ep: {
  l: null,
  r: null
  }
  };
  var dx = s2.x - s1.x, dy = s2.y - s1.y, adx = dx > 0 ? dx : -dx, ady = dy > 0 ? dy : -dy;
  newEdge.c = s1.x * dx + s1.y * dy + (dx * dx + dy * dy) * .5;
  if (adx > ady) {
  newEdge.a = 1;
  newEdge.b = dy / dx;
  newEdge.c /= dx;
  } else {
  newEdge.b = 1;
  newEdge.a = dx / dy;
  newEdge.c /= dy;
  }
  return newEdge;
  },
  intersect: function(el1, el2) {
  var e1 = el1.edge, e2 = el2.edge;
  if (!e1 || !e2 || e1.region.r == e2.region.r) {
  return null;
  }
  var d = e1.a * e2.b - e1.b * e2.a;
  if (Math.abs(d) < 1e-10) {
  return null;
  }
  var xint = (e1.c * e2.b - e2.c * e1.b) / d, yint = (e2.c * e1.a - e1.c * e2.a) / d, e1r = e1.region.r, e2r = e2.region.r, el, e;
  if (e1r.y < e2r.y || e1r.y == e2r.y && e1r.x < e2r.x) {
  el = el1;
  e = e1;
  } else {
  el = el2;
  e = e2;
  }
  var rightOfSite = xint >= e.region.r.x;
  if (rightOfSite && el.side === "l" || !rightOfSite && el.side === "r") {
  return null;
  }
  return {
  x: xint,
  y: yint
  };
  },
  rightOf: function(he, p) {
  var e = he.edge, topsite = e.region.r, rightOfSite = p.x > topsite.x;
  if (rightOfSite && he.side === "l") {
  return 1;
  }
  if (!rightOfSite && he.side === "r") {
  return 0;
  }
  if (e.a === 1) {
  var dyp = p.y - topsite.y, dxp = p.x - topsite.x, fast = 0, above = 0;
  if (!rightOfSite && e.b < 0 || rightOfSite && e.b >= 0) {
  above = fast = dyp >= e.b * dxp;
  } else {
  above = p.x + p.y * e.b > e.c;
  if (e.b < 0) {
  above = !above;
  }
  if (!above) {
  fast = 1;
  }
  }
  if (!fast) {
  var dxs = topsite.x - e.region.l.x;
  above = e.b * (dxp * dxp - dyp * dyp) < dxs * dyp * (1 + 2 * dxp / dxs + e.b * e.b);
  if (e.b < 0) {