TEXT FOR THE EMAIL SUBMISSION  
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Visionary Statement 
The body is an API – but who programs it?  

Contemporary approaches to the 'body' in commercial digital technology are characterised by problematic notions of 'mind-less bodies' and 'body-less minds'. Our collaborative submission addresses the question of how to configure the 'quantified self'. We approach this issue across two modes of body quantification:  

- Disembodied data, passively generated by personal health devices, is processed and circulated in closed, hidden ways before finally being re-presented to us as an abstracted 'digital self', in the form of charts and graphs that measure our bodies' value.  
- Digital bodies modelled as 3D meshes – created for gaming and animation – encourage a quantified abstraction of body topologies. The accompanying parametric interfaces, which allow 'flexibility' in the creation of 3D characters, only encourage the pursuit of idealised body forms and fantasies.  

Our proposition is to develop other (proto)types of digital devices, software and infrastructure that both reveal the complex flaws behind this current context and propose alternative visions for what 'self quantification' might mean. These devices – a kind of '(digital) companion species' (Braidotti, Haraway, Hayles) – should emerge from a process of 'co-evolution' that reflects the variable agencies of software (Fuller, Mackenzie).  

Weblink to hosted video 
????  

Collective Team 
After a week spent together at relearn (20-25th August 2015 in Brussels) working on issues around 'self quantification', we decided to answer the Sparks call as a collective team:  

Anna Carreras
carreras.anna@gmail.com
http://annacarreras.com/  

Natacha Roussel
natacha@natacharoussel.com
http://walker.domainepublic.org/

Phil Langley
langley.phillip@gmail.com 
www.openkhana.net/  

Our CVs and detailed biographies are available, along with our submission, at: 
main project site 
https://github.com/netachepas/quantified-ars

Team member CVs and biographies 
https://github.com/netachepas/quantified-ars/tree/master/CVs

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


Video script text

Proposed title: Configuring the quantified self
  
INTRO - In recent years, wearables have seen mass-market adoption: new devices keep appearing, and the growing ability to connect them to one another feeds more and more applications. Nevertheless, the data wearables collect and their related applications still lack a clear and sustainable benefit to users. The collected data and its visualization remain quite boring, showing charts, pies, numbers, etc. 

(images of current visualizations of 'health' data)

SCENE 1 - The measurements are hyper-specific; they do not relate to any history of body representation, and self quantification completely lacks any possible poetic representation, breaking the human into bits (bit as a small piece; bit as binary data).
Further research on parametric modeling of the human body builds on earlier design standardization to help designers model objects that fit different body criteria.

(images from the reMakeHuman video, with the text below)

"Hi, my name is Joe. I run 4.1  miles a day at a 7:52 mile clip and I  sleep 6.6 hours a night at 55.4%  deep sleep. I have 435 friends, 1359  followers, and 1267 connections. I  went to 136 places in 24 cities last  year, and I drove an average of  32.6 miles a day to get there (at 42.2  miles per hour). In the last 30  days, I answered 28 questions, read 121  articles, and wrote another 12.  I currently rank #7 in cool points, but  also #3 in douchebaggery." 

"Hi  my name is Rachel I am parametric: set the first slider over 0.5 on the  feminine side the second slider somewhere in between 0.7 and 0.8 sets  my age. Furthermore the third slider ideally around 0.2 my weight  absolutely fixed, and 4th does not change at 0.8 for height, and 5th  absolutely perfectly on 0.5 for proportions.  Finally the 3 last  interdepent sliders define my ethnicity 0.2-0.5-0.3" 

SCENE 2 - The issue of human relations in the context of medicine is not a new one; it has been addressed over time in different ways, and we face similar issues in the context of self quantification practices.

(images of Dr Praetorius)

SCENE 3 - The current processes benefit directly from data donations that take advantage of a long history of self-help and support groups, but they do not seem to fit the civil rights demands associated with some of those movements, among them feminist ones. We feel the need to explore other approaches to this information and to think about how to visualize and gamify those processes.

Issues of body representation are at stake. It is great to see that new tools exist and are at our disposal; we want to appropriate them, explore them, and see their potential and limits. Parametric design provides amazing possibilities for quickly representing human bodies in 3D, such as the powerful MakeHuman software. However, the social limitations embedded in the interface design restrict the representations to an abstract and placeless vision of the body.

(images of reMakeHuman)

SCENE 4 - OUR PROPOSAL. 
To the benefit of our health, there is more to walking than step data: it helps us stay actively conscious of our surroundings, through an instantaneous sense of perception. We propose not only to visualise differently, but also to envision different ways of capturing data; passing information through different media could be useful, e.g. transforming a photo into sound.

We propose to address the representation of self quantifiers with more relevant body tracking information, looking for a more holistic representation and for ways to capture and illustrate subtle and contextual information.
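
As an illustration of the photo-to-sound idea above, here is a minimal sketch of one possible transformation in Python. It assumes Pillow is installed; the file names, the 64x64 resolution and the 220-880 Hz mapping are illustrative choices, not part of the proposal.

    # minimal sketch: map the brightness of each image row to a short tone
    # assumes Pillow; "walk_photo.jpg" is a placeholder file name
    import math
    import struct
    import wave
    from PIL import Image

    RATE = 44100        # audio samples per second
    TONE = 0.05         # seconds of sound per image row

    img = Image.open("walk_photo.jpg").convert("L").resize((64, 64))
    pixels = list(img.getdata())

    samples = []
    for row in range(64):
        brightness = sum(pixels[row * 64:(row + 1) * 64]) / 64.0   # 0..255
        freq = 220 + (brightness / 255.0) * 660                    # map to 220-880 Hz
        for n in range(int(RATE * TONE)):
            samples.append(int(32767 * 0.3 * math.sin(2 * math.pi * freq * n / RATE)))

    with wave.open("walk_photo.wav", "w") as out:
        out.setnchannels(1)
        out.setsampwidth(2)
        out.setframerate(RATE)
        out.writeframes(struct.pack("<%dh" % len(samples), *samples))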



>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
'original' script for reMakeHuman...this used 'text-to-speech' (it is 350 words and lasted for 2min 30sec) 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Destabilizing Make human
first of all download the make human source code from bit bucket.
some examples of how to destabilize make human

changing the language.
you can change the language of the software - use your own imagination. look inside the data folder and then the language folder to choose which language you would like to modify. you can use grep to recursively search files. 
simple.
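
if you would rather script the change, here is a minimal sketch of that recursive search and replace. it assumes the language files are plain text under data/languages; the word pair swapped here is only an example.

    # minimal sketch: recursively swap one word for another across the language files
    # assumes plain-text files under data/languages; the substitution is only an example
    import os

    ROOT = "data/languages"
    OLD, NEW = "Gender", "Spectrum"

    for dirpath, _, filenames in os.walk(ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8") as f:
                text = f.read()
            if OLD in text:
                with open(path, "w", encoding="utf-8") as f:
                    f.write(text.replace(OLD, NEW))
                print("modified", path)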

changing the colors.
if you are uncomfortable with the ethnicity sliders, you can change them. all you need to do is edit the image files that are used for texture. these are located in the lit spheres folder. we have changed them using gimp to be red green and blue. this means you can make any color model you want. 
nice.
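
if you prefer to script the recolouring instead of using gimp, here is a minimal sketch with Pillow. the source file name is an example; the real images are the ones in the lit spheres folder.

    # minimal sketch: blend an existing litsphere texture towards red, green and blue
    # assumes Pillow is installed; the source file name is illustrative
    from PIL import Image

    source = Image.open("data/litspheres/lit_standard.png").convert("RGB")
    for colour, label in [((255, 0, 0), "red"), ((0, 255, 0), "green"), ((0, 0, 255), "blue")]:
        overlay = Image.new("RGB", source.size, colour)
        Image.blend(source, overlay, 0.6).save("data/litspheres/lit_%s.png" % label)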

playing with the defaults.
if you are not happy with your start up model being made from so called averages, you can randomize it. edit the human python file and replace the default middle values with random numbers. 
easy.
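
a minimal sketch of the idea follows. the slider names are illustrative, not the real make human code, but the edit in the human python file follows the same pattern: replace the fixed 0.5 values with random ones.

    # minimal sketch: start-up values drawn at random instead of the 0.5 "averages"
    import random

    def randomized_defaults():
        # one random value per slider; names here are only examples
        return {name: random.random() for name in
                ("gender", "age", "weight", "height", "proportions")}

    print(randomized_defaults())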

choose some different icons.
if you don't like the icons inside make human you can always change them to something else. look for the targets folder inside the data folder. each of the body parts has its own icon set inside. you can just edit or change the image files.
no problem.

parametric decoupling.
this sounds complicated, but do not worry. we want to affect the relationship between the parameters and to increase the space of possibilities. inside the human modifier python file we have removed the part of the code that updates the related modifiers. it is in the apps folder. between lines 228 and 237. this will give you some unexpected outcomes. 
wow.
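
to see what removing that code does, here is a minimal, self-contained sketch of the coupling and of switching it off. the class and slider names are illustrative, not the actual make human code at those lines.

    # minimal sketch of parametric coupling and of its removal
    class Modifier:
        def __init__(self, name, value=0.5):
            self.name = name
            self.value = value
            self.related = []           # modifiers normally updated alongside this one

        def set_value(self, value, decoupled=False):
            self.value = value
            if not decoupled:           # removing this propagation is the decoupling
                for other in self.related:
                    other.value = value

    waist = Modifier("waist")
    hips = Modifier("hips")
    waist.related.append(hips)

    waist.set_value(0.9)                # coupled: hips follow
    print(hips.value)                   # 0.9
    waist.set_value(0.1, decoupled=True)
    print(hips.value)                   # still 0.9 - unexpected outcomes become possible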

give an identity to your modified software.
once you have personalized your software, you should give it an identity. if you search for the splash image inside the theme folder you can edit it or replace it with an image of your own. 
beautiful.

congratulations. you have hacked make human.
you can fork the software on bit bucket and make your own, or you can follow our bug report that explains our suggestions.
good luck.