Friday, June 29, 2018

Blocking load of third party scripts in Blogger until cookies agreed for GDPR Cookie compliance.

More than a month after GDPR came into force, it is still a hot topic. Many people have discovered that even their simple hobby blogs fall within the scope of GDPR compliance, and have struggled to understand exactly what they need to do to remain on the right side of the law.

There are plenty of blog posts and guidance sites out there now offering advice on banners, privacy statements, and what it means to be a data controller, which will help the typical blogger navigate this minefield.

But there is one area which, even for tech-savvy bloggers, is still problematic: the rule that no cookies other than a basic, non-data-based session cookie can be written without the express prior consent of a site visitor from a GDPR country. Blogspot, which hosts many of my sites, has been fairly good about controlling the cookies it writes itself (though the navbar in some templates still occasionally leaks cookies while the cookie banner is on show, for some reason??). What it doesn't cover - simply because it's really not its problem - is control of cookies written by any third party gadgets that bloggers add to their sites, eg: adverts, Twitter feeds, social media tools and affiliate product linking.

I suggest at this point all blog owners pay a visit to https://www.cookiemetrix.com/ and type in the URL of their blog to see whether cookies are being written and general GDPR compliance is met. Checking your site via proxy sites is NOT a safe way of determining this, as numerous tests have shown that what is served by proxies can be very different to a direct connection, due to their own internal processes. Use a dedicated free GDPR test site like CookieMetrix.

If third party cookies are an issue - read on!

OK.... first a bit of background. How do bloggers normally add a third party script or widget to their Blogspot site?  Well, they could roll up their sleeves and directly edit the HTML/XML code of their template, but the vast majority will add a Blogspot HTML/Javascript gadget:







Easy enough... but it provides no control over what that script then goes on to do as far as cookies are concerned.

What my solution does is use the same process, but instead create a placeholder where these scripts will go. This allows us to defer loading those scripts until we have checked whether the visitor needs to give cookie consent, and if so, that they have.




So how does this work?  Well, most people with a little background in HTML/CSS/Javascript will realise that you can create a <div> tag, give it a unique ID and modify it later. It is the basis of pretty much all dynamic page content. See https://www.w3schools.com/js/js_htmldom_elements.asp for examples etc.

The problem is that while you can retrospectively change the style, the text content and most HTML tags within an element, any scripts you try to add, either via innerHTML or by appending child nodes, will not be executed - both because the page load has finished, and as a security measure to prevent possible cross-site scripting exploits.... Bugger!

To solve this we need some clever asynchronous reading and writing of the script to the page, which allows the scripts to still be parsed after the page has finished loading.

I spent quite a bit of time looking at ways of doing this, and for several of my other sites found a solution that works well with most third party includes... but some third party scripts still proved problematic - ones using document.write.

Having gone round in circles for some time, I re-discovered in my notes on useful libs Postscribe (https://krux.github.io/postscribe/), which actually solves this problem very simply. Yay!

A few quick tests confirmed that it was possible to use Postscribe and a little jQuery to retrospectively add executable scripts long after the page load has finished....

And that forms the basis of the system I now use.

So, let's first describe how to use my system to add third party scripts that are deferred until cookie compliance agreement is met, and then I will break down the code to explain how it works.



The first thing you need to do is pop over to https://ipdata.co/ and click on the "Get a free API key" button, because we will be using their API to work out whether the visitor is from a GDPR country. You will be emailed a key, which you need to include in the code that follows. A free API key gets you 1500 location checks a day. (You can sign up to a paid plan if your site has more traffic, or you could use cookies to save the visitor's country code once cookies are agreed, to limit the number of API calls needed - but that is beyond the scope of this blog post.)



Next, we are going to be using both the jQuery and Postscribe libs, so you need to add them to the <head> of your site. To do this, open up the theme and click on Edit HTML



... And then paste :

<script src='https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js'/>
<script src='https://cdnjs.cloudflare.com/ajax/libs/postscribe/2.0.8/postscribe.min.js'/>

immediately after the opening <head> tag.


Then click on Save Theme to finish.  This means that when your page loads, those two javascript libraries will be available for our code further down the page. 

So far so good.


Now, rather than adding the third party code directly to a Blogspot gadget as you would normally, we use a placeholder div with a unique id, eg: myPlaceHolder1 (you can add as many as you like, but each needs a unique id):

<div id="myPlaceHolder1"></div>




Next, create an HTML / Javascript gadget as close to the bottom of the page as you can:




and paste in the following code:



<script>
	

function showDivs(){

   // this is the bit you need to edit =============================================================================================

   // (see blog post for details)

   // add amazon search widget
   $(function() {
     postscribe('#myPlaceHolder1','<script type="text/javascript">amzn_assoc_ad_type ="responsive_search_widget";.... etc .... <\/script>');
   });

   // add twitter widget
   $(function() {
     postscribe('#myPlaceHolder2','<a class="twitter-timeline" data-width="300" data-height="600" data-theme="dark" .... etc .... <\/script>');
    });

    //etc...

}

// array of country codes where GDPR cookie agreement is needed (source : https://www.hipaajournal.com/what-countries-are-affected-by-the-gdpr/)
// if other countries need cookie acceptance, add their country codes to this array 
var reqGDPR = ["AT","BE","BG","HR","CY","CZ","DK","EE","FI","FR","DE","GR","HU","IE","IT","LV","LT","LU","MT","NL","PL","PT","RO","SK","SI","ES","SE","GB"];





//functions and globals 

// set up an interval timer for five second intervals - this will repeatedly call the function pollCookieAgree, which checks for the
// agreement cookie being written (showing that the visitor has agreed to the gdpr cookie terms), until it is cancelled
// because this is not a visitor from a gdpr country
var ourInterval = setInterval("pollCookieAgree()", 5000);


//general purpose cookie reading function
function readCookie(name) {  
            var nameEQ = name + "=";
            var ca = document.cookie.split(';');
            for (var i = 0; i < ca.length; i++) {
                var c = ca[i];
                while (c.charAt(0) == ' ') c = c.substring(1, c.length);
                if (c.indexOf(nameEQ) == 0) return c.substring(nameEQ.length, c.length);
            }
            return null;
  }



//  interval based listener function - called until cancelled by ourInterval further up the script
function pollCookieAgree(){
	
	// call the readCookie script and grab its returned value
	var myCookie = readCookie("displayCookieNotice");


	if (myCookie==="y"){
	
		// yes - we have found the cookie that shows cookies have been agreed to
		console.log("found the cookie acceptance cookie");
		

		//OK, we have found the cookie that shows that the cookie banner has been accepted


		//we dont need the timer any more
		clearInterval(ourInterval);
    
		// safe to show the divs 
		console.log("Calling the jQuery and postscribe async read and insertion of the ads");
   
   
		// we can now use a combination of jQuery and postscribe to asynchronously modify 
		// each of the placeholders we set using the function showDivs() which we edited at the top of this script
		showDivs();

	}

}



// This is where the main activity takes place!

// let's find out where our visitor is from. We put in a jQuery call to the geolocation API @ ipdata.co to get the user's location data
// This returns a JSON formatted block of data based on the approximate location (by ip addr) of the visitor. Remember to add your API key!


// No, that's not a typo-----------------------------------------v - the closing ) is down at the bottom of the script
$.getJSON('https://api.ipdata.co?api-key=01234567890123456789',
 
 
	// we set up a callback for the response
	function(geodata) {
        
        
        // dump the result out to the console log for debug  - always handy!
        // there is quite a bit of useful info there - but remember we are trying to meet GDPR!!
        // You really should do nothing else with this data unless your visitor has specifically agreed to it.
        console.log(geodata);

		// now see if the returned country code (held in geodata.country_code) occurs in the array of gdpr codes 
		// which we declared right at the top of the script 
		if (reqGDPR.includes(geodata.country_code)){
	
	        // Yes :: OK this is a GDPR situ!   
	        
	        // make a note to the console log that we have realised the visitor is from a GDPR country
            console.log("GDPR visitor from : " + geodata.country_code);
            	
			
			// need do nothing else as the timer based cookie check function will run every five  seconds to check
			// if cookies policy has been accepted, and will not call the ads until that is confirmed 
			  
	
  
		}else{
	
			// Nope, looks like we are good... this visitor is not from a place where GDPR applies : go get those ads!
	
			//we dont need the interval timer any more
			clearInterval(ourInterval);
	
			// we can now use a combination of jQuery and postscribe to asynchronously modify 
			// each of the placeholders we set using the function showDivs() which we edited at the top of this script
			showDivs();
	
		}
	
	
	}
	
	
);	
</script>







First, edit the line about half way down:

$.getJSON('https://api.ipdata.co?api-key=01234567890123456789',

replacing the dummy key with your own API key.


Then take the third party script that was provided by Twitter or whoever, and edit the code that you have just pasted, replacing the example content between the second pair of single quotes with the third party script that you want to include:

// add twitter widget
   $(function() {
     postscribe('#myPlaceHolder2','<a class="twitter-timeline" data-width="300" data-height="600" data-theme="dark" .... etc .... <\/script>');
    });

Repeat for as many third party scripts as you have.

IMPORTANT: you will need to escape the closing script tag in the third party code, if there is one. Eg: if the code you pasted from the third party site contains </script>, you need to add a backslash so you end up with <\/script>

Similarly, if the script you paste in contains any single quotes, you will need to put a backslash before each of them.
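If you have a lot of snippets to convert, that escaping can be automated. Here is a minimal sketch - escapeForPostscribe is my own name for it, not part of Postscribe:

```javascript
// Prepare a third party snippet for embedding in a single-quoted
// javascript string: escape backslashes and single quotes, and
// break up any closing script tags so the browser's HTML parser
// does not end our gadget's <script> element early.
function escapeForPostscribe(snippet) {
  return snippet
    .replace(/\\/g, "\\\\")                  // escape backslashes first
    .replace(/'/g, "\\'")                    // escape single quotes
    .replace(/<\/script>/gi, "<\\/script>"); // escape closing script tags
}
```

The output is what you would paste between the second pair of single quotes in the postscribe() call.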

That's it... that is all you need to do. Each time you add a third party script or widget to your site, create a placeholder div, copy the block of code above, and edit the placeholder id and third party script as described.

Your site will now check whether the visitor comes from a GDPR region and, if so, wait for them to agree to cookies before loading the third party scripts. If not, the scripts will load a few seconds after the page has loaded.


And now, the description of how it works...

Let's look at this segment first:




// set up an interval timer for five second intervals - this will repeatedly call the function pollCookieAgree, which checks for the
// agreement cookie being written (showing that the visitor has agreed to the gdpr cookie terms), until it is cancelled
// because this is not a visitor from a gdpr country
var ourInterval = setInterval("pollCookieAgree()", 5000);

.
.
.
.
.


//  interval based listener function - called until cancelled by ourInterval further up the script
function pollCookieAgree(){
	
	// call the readCookie script and grab its returned value
	var myCookie = readCookie("displayCookieNotice");


	if (myCookie==="y"){
	
		// yes - we have found the cookie that shows cookies have been agreed to
		console.log("found the cookie acceptance cookie");
		

		//OK, we have found the cookie that shows that the cookie banner has been accepted


		//we dont need the timer any more
		clearInterval(ourInterval);
    
		// safe to show the divs 
		console.log("Calling the jQuery and postscribe async read and insertion of the ads");
   
   
		// we can now use a combination of jQuery and postscribe to asynchronously modify 
		// each of the placeholders we set using the function showDivs() which we edited at the top of this script
		showDivs();

	}

}





Basically we are creating a "listener" script. The initialization -

var ourInterval = setInterval("pollCookieAgree()", 5000);

Sets up a recurring timer to call the pollCookieAgree function once every five seconds. (This will run forever until it is cancelled.)

This in turn uses the generic cookie reading function readCookie(name) to check for the presence of a cookie called "displayCookieNotice". (This will have been written out by Blogspot once a visitor has agreed to the cookie banner terms.)

If it is not found, the function does nothing and waits until it is next called by the interval timer, or is cancelled externally.

If it does find the cookie, however, it is safe to continue loading the third party scripts. It first cancels the interval timer, as it is no longer needed, then, after logging to the console, calls our function showDivs().
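If you want to sanity-check that decision outside the browser, the core of pollCookieAgree boils down to a pure function of the cookie string. cookieAgreed is just an illustrative name; in the live script, readCookie reads document.cookie directly:

```javascript
// Return true if the Blogspot agreement cookie is present and set
// to "y" in a raw cookie string such as document.cookie
function cookieAgreed(cookieString) {
  var ca = cookieString.split(';');
  for (var i = 0; i < ca.length; i++) {
    var c = ca[i].trim(); // strip the space after each semicolon
    if (c.indexOf("displayCookieNotice=") === 0) {
      return c.substring("displayCookieNotice=".length) === "y";
    }
  }
  return false; // cookie not present - visitor has not agreed yet
}
```

Being a pure function, you can feed it test cookie strings directly rather than waiting on the banner.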


	

function showDivs(){

   // this is the bit you need to edit =============================================================================================

   // (see blog post for details)

   // add amazon search widget
   $(function() {
     postscribe('#myPlaceHolder1','<script type="text/javascript">amzn_assoc_ad_type ="responsive_search_widget";.... etc .... <\/script>');
   });

   // add twitter widget
   $(function() {
     postscribe('#myPlaceHolder2','<a class="twitter-timeline" data-width="300" data-height="600" data-theme="dark" .... etc .... <\/script>');
    });

    //etc...

}



This does all the hard work of inserting the deferred third party scripts using Postscribe.

postscribe('#myPlaceHolder1','<script>.... etc .... <\/script>');

This takes the script source held between the second pair of single quotes, buffers it via Postscribe, and does an async child append to the div targeted by the selector between the first pair of single quotes. Problems with things like scripts using document.write are taken care of. Simple as that :)

But...!!!

That is only half the story. If a visitor from a non-GDPR country arrived, the cookie checking script would just loop forever, because Blogspot only writes that cookie when someone agrees to the GDPR banner.

We need a second line of attack...

This is where the ipdata API call comes in.




// array of country codes where GDPR cookie agreement is needed (source : https://www.hipaajournal.com/what-countries-are-affected-by-the-gdpr/)
// if other countries need cookie acceptance, add their country codes to this array 
var reqGDPR = ["AT","BE","BG","HR","CY","CZ","DK","EE","FI","FR","DE","GR","HU","IE","IT","LV","LT","LU","MT","NL","PL","PT","RO","SK","SI","ES","SE","GB"];

.
.
.
.

// let's find out where our visitor is from. We put in a jQuery call to the geolocation API @ ipdata.co to get the user's location data
// This returns a JSON formatted block of data based on the approximate location (by ip addr) of the visitor. Remember to add your API key!


// No, that's not a typo-----------------------------------------v - the closing ) is down at the bottom of the script
$.getJSON('https://api.ipdata.co?api-key=01234567890123456789',
 
 
	// we set up a callback for the response
	function(geodata) {
        
        
        // dump the result out to the console log for debug  - always handy!
        // there is quite a bit of useful info there - but remember we are trying to meet GDPR!!
        // You really should do nothing else with this data unless your visitor has specifically agreed to it.
        console.log(geodata);

		// now see if the returned country code (held in geodata.country_code) occurs in the array of gdpr codes 
		// which we declared right at the top of the script 
		if (reqGDPR.includes(geodata.country_code)){
	
	        // Yes :: OK this is a GDPR situ!   
	        
	        // make a note to the console log that we have realised the visitor is from a GDPR country
            console.log("GDPR visitor from : " + geodata.country_code);
            	
			
			// need do nothing else as the timer based cookie check function will run every five  seconds to check
			// if cookies policy has been accepted, and will not call the ads until that is confirmed 
			  
	
  
		}else{
	
			// Nope, looks like we are good... this visitor is not from a place where GDPR applies : go get those ads!
	
			//we dont need the interval timer any more
			clearInterval(ourInterval);
	
			// we can now use a combination of jQuery and postscribe to asynchronously modify 
			// each of the placeholders we set using the function showDivs() which we edited at the top of this script
			showDivs();
	
		}
	
	
	}
	
	
);	







The first thing we do is build an array of the two-char country codes for every country that implements GDPR:

var reqGDPR = ["AT","BE","BG","HR","CY","CZ","DK","EE","FI","FR","DE","GR","HU","IE","IT","LV","LT","LU","MT","NL","PL","PT","RO","SK","SI","ES","SE","GB"];

Next we use jQuery to get a JSON formatted object back from the ipdata API, and set up a callback:

$.getJSON('https://api.ipdata.co?api-key=01234567890123456789',
    function(geodata) {...}   // we set up a callback for the response
);

(See https://g7nbp.blogspot.com/2018/04/using-jquery-and-ipdata-api-to-serve.html for more info).

if (reqGDPR.includes(geodata.country_code)){....}

Here we compare the returned geodata.country_code to our array of GDPR countries, and if there is a match, we do nothing... because our interval-timed script will be along shortly to keep checking for the agreement cookie.

If we don't find a match, then this is not a GDPR country anyway, so we cancel the timer and call showDivs() to begin writing out the deferred third party scripts to the placeholder divs...
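That decision is easy to test in isolation too; a minimal sketch, where needsGDPRConsent is just an illustrative wrapper around the same array used in the gadget code:

```javascript
// ISO 3166-1 alpha-2 codes for countries where GDPR cookie consent is needed
var reqGDPR = ["AT","BE","BG","HR","CY","CZ","DK","EE","FI","FR","DE","GR","HU","IE",
               "IT","LV","LT","LU","MT","NL","PL","PT","RO","SK","SI","ES","SE","GB"];

// true if a visitor with this country code must agree to cookies
// before we load the third party scripts
function needsGDPRConsent(countryCode) {
  return reqGDPR.includes(countryCode);
}
```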

Jobsagoodun!

So, we have catered both for GDPR visitors, who may or may not have agreed to cookies, and for non-GDPR visitors.

I hope that makes sense, and that you can use a variant of the above to solve any third party script issues you may have.

Tuesday, June 26, 2018

Dev work ongoing...

OK, a brief update. EU visitors to my site should now be greeted first by the cookies banner, and only if they agree to it will Amazon and Twitter content be loaded. Yay! The work still remaining is to build in the script I posted a few weeks back, which decides whether this is needed based on the visitor's country code. This is because the Blogger cookie I am using as the trigger for GDPR cookie acceptance is only written for EU visitors, so it alone cannot be relied upon. Ie: non-EU visitors at present are not going to see ads, resources or the Twitter feed. A full post with worked examples will follow.

Sunday, June 24, 2018

Updates and missing sidebar stuff...

Just a quick post as it's 3:20am and I've been working on this since 8:00pm last night!

Regular visitors will no doubt have noticed that my sidebar widgets have gone. The reason for this is quite simple: they write cookies without asking. As most people will realise, this is a no-no under GDPR.

Blogspot usefully provides a cookie that shows when a visitor agrees to the cookie usage banner, and I've fixed up a fairly straightforward listener script that checks for the presence of this cookie. It works fine...

The idea is that where the widgets were, I've put simple placeholder div tags, and that once my listener verifies a visitor has agreed to the use of cookies, it updates the placeholder div tags with the third party provided scripts, and thus loads the Amazon and Twitter feeds.

The problem however is that it's not quite that easy....

Firstly, you can't just fudge it using document.getElementById and innerHTML to modify the div content, because browser security blocks this to prevent cross site scripting attacks. You have to do it properly using the DOM. 

This means using something along the lines of

var sc = document.createElement("script");
sc.setAttribute("src", "https://somesite.com/somescript.js");
sc.setAttribute("type", "text/javascript");
document.getElementById("myPlaceHolder").appendChild(sc);

Which appends a script tag within the div itself.

This pretty much works for most external scripts, and I've used this to good effect on several other sites....

But...

With both the Amazon and Twitter feeds, the script being called uses document.write to create content. This is bad... Bad firstly because it's a blocking method: as a synchronous write insertion it normally holds up the page load. But by the time we have waited for the page to load, and for the visitor to agree to cookies, activating my listener script and kicking off the createElement insertion, the page has long ago finished loading.... and document.write can no longer add to the current div... Doh!

What is needed is an asynchronous method of writing to the placeholder, but of course I have no control over the content of the third party script.

This makes life a lot more complicated!

So, at present I'm looking at using jQuery or an Ajax script to act as an asynchronous buffer for the third party script. This should solve the problem of insertion after the page load is complete, but adds a lot of complexity.

I will do a full write up when this is finished, as I'm sure there will be a number of bloggers using Blogspot who are in the same position - having third party scripts that write cookies, but that they have no control over.

In the meantime, sidebar widgets will be off and the site may be a little odd while I run these tests.

As the BBC used to say: "Normal service will resume shortly, please do not adjust your sets" :)

More later...

Friday, June 22, 2018

Keeping older hardware alive and useful

If you are anything like me, you will have a few old PCs and laptops that are getting a bit long in the tooth, perhaps only 32-bit architecture, but you can't quite bring yourself to dump them yet - even though a modern Windows install, or even most recent mainstream Linux distros, probably won't even boot, or would be painfully slow...

... But all is not lost; there are still up-to-date, feature-packed Linux distributions that can breathe new life into your older systems.

Right now I'm creating this blog post using one of my several ageing Sony Vaio laptops, which is more than a decade old. Current spec is a 2GHz CPU (Pentium 4 Mobile) with 2 gigs of RAM. It has a fairly respectable 1600x1200 resolution display and supports a second monitor to further expand the desktop - so it is a useful machine for development work. Its only real limitation is being a P4.

Despite this handicap, this is how my desktop looks:


(you can click any of the images to see them full size)



A full featured GUI with transparency, shadowing and all the most up-to-date features and software packages found in a Linux distro.


How have I achieved this?  BunsenLabs Helium, a lightweight Debian-based Linux distro, released in both 32-bit and 64-bit architectures. It quite happily installs on machines with as little as half a gig of RAM and just a few gigs of disk space. Despite this, it is highly customisable, and very quick in use.

A few quick menu-based tweaks gets you the attractive grey theme, edits the on-screen system info (conky) and the top bar (tint2), and adds the background image. I downloaded a mono-grey icon set (as I wanted a largely mono feel to the desktop), which simply has to be placed in /usr/share/icons/, and other than that, it's pretty much stock. You can save all the config changes you make using the BLOB theme manager:


Which makes experimenting with settings, and rolling back to an earlier look and feel when things go astray, really easy.


So does it work...?

Well yes, I'm editing this using the latest version of Chrome (which also has a dark theme added)


And despite having a nice desktop with animations, blending and shadowing running, along with half a dozen Chrome tabs open, a VNC session to my development server, and some music playing, I'm still using only just over half a gig of RAM and very little CPU.

I'm not going to bother writing up a full howto, as there is plenty of info on the BunsenLabs site and in the help forums. In fact, all the info you need is provided within the base install itself, as it has very comprehensive help menus. Being based on the latest stable Debian, there is a wealth of info available as well. You really can't go wrong with this!

Instead, just a few more screenshots:


Using Geany editor / IDE for HTML editing and other coding



Yes, that's the full version of GIMP with all the extras!



And that's a VNC remote desktop session running to my development server (which is also running Helium with only half a gig of RAM!)


As lightweight distros go, I really can't fault this. It is much lighter in footprint than Lubuntu, which was my previous lightweight distro of choice. Blog followers will realise I'm not one to be easily swayed in my opinions when it comes to distros, but I'm sold.

If you need a lightweight distro, they don't come much lighter than Helium!

Sunday, April 15, 2018

Using JQuery and the IPData API to serve content based on locale

Have you ever wanted to be able to greet visitors to your blog or project website with a personalised greeting in their own language, or to offer content or advertising links based on the country that a visitor to your site is coming from?

If so, this short tutorial post might well help you along the way with that.

You will need to be able to edit the actual HTML content of your site, or be able to paste in an HTML/Javascript block (which Blogger supports, either as a widget or as a code edit via the main interface).

You will first need to visit http://ipdata.co and click on the button to get a free API key, which will allow you to make 1500 queries per day.

The code below is pretty much all you need to get started, though you will obviously need to edit both the country codes and the actual content you want to serve.


<div id="UK" style="display:none"><h1>UK based content goes here</h1></div>
<div id="US" style="display:none"><h1>US based content goes here</h1></div>
<div id="FR" style="display:none"><h1>FR based content goes here</h1></div>
<div id="other" style="display:none"><h1>other language based content goes here</h1></div>



<!-- load jquery, which makes this much easier! -->

<script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>



<script>

    // put in an API call to ipdata.co to get the user's location data
    $.getJSON('https://api.ipdata.co?api-key={your api key}', 

function(geodata) {
        // dump the result out to the console log for debug 
        console.log(geodata);


  // now we can make some choices (you can get a full list of country codes here : https://en.wikipedia.org/wiki/ISO_3166-1)
  switch(geodata.country_code) {
    case "GB":
        //country code was == to "GB"
        console.log("I say old chap!");
        // toggle this visibility of the GB specific content div
        UK.style.display="block";
        break;
    case "US":
        console.log("Howdy partner!");
        // toggle this visibility of the US specific content div
        US.style.display="block";
        break;
    case "FR":
        console.log("Bonjour!");
        // toggle this visibility of the FR specific content div
        FR.style.display="block";
        break;

    // add more to suit if needed!
    
    default:
       // toggle this visibility of the catchall div 
       console.log("Hello!!");
       other.style.display="block";
  }

});
</script>



Although the above is pretty self-explanatory, I will break down exactly what is happening a little:

Firstly, create a div for each language block, with its ID set to match the expected two-char country codes. The important part is to set the style to display:none, which means that by default they are all hidden. When the script runs and pulls back a country code, the case-switch block will change the display of one of your divs from none to block, and will thus make just that div visible.

How?  Well, next we load the jQuery library. If you are not familiar with it, it makes things like asynchronous (background) calls to other sites and APIs so much easier. I did start out writing this test in pure javascript, but quickly decided that the extra overhead of using jQuery is worth it in terms of simplicity - and let's face it, Blogger loads so much other stuff that one more lib isn't going to break anything!

Once jquery is loaded we can then use its $.getJSON call to grab the output from https://api.ipdata.co

This looks up the visitor's IP address (and other details) and returns them in a JSON formatted block.


If you visit https://api.ipdata.co in your browser, you will see what the return looks like:

{
    "ip": "x.x.x.x",
    "city": "London",
    "region": "England",
    "region_code": "ENG",
    "country_name": "United Kingdom",
    "country_code": "GB",
    "continent_name": "Europe",
    "continent_code": "EU",
    "latitude": 51.5142,
    "longitude": -0.0931,
    "asn": "AS60339",
    "organisation": "BT Internet",
    "postal": "EC2V",
    "currency": "GBP",
    "currency_symbol": "\u00a3",
    "calling_code": "44",
    "flag": "https://ipdata.co/flags/gb.png",
    "emoji_flag": "\ud83c\uddec\ud83c\udde7",
    "time_zone": "Europe/London",
    "utc_offset": "+0100",
    "is_eu": true,
    "suspicious_factors": {
        "is_tor": false
    }
}

Essentially a list of parameters about your location, based on your public IP address, in JSON format. If you are not familiar with JSON, have a look at https://en.wikipedia.org/wiki/Json - basically, JSON is a language-independent data format, much like XML, which is very commonly used in web systems to communicate data between systems via API calls. It has a machine- and human-readable structure, meaning that you can look at its output and understand it, and, more importantly, javascript can parse it and extract data.
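As an aside, parsing that text into a usable object is exactly what jQuery's $.getJSON does for you behind the scenes; in plain javascript it is a single call, shown here with a cut-down sample of the response above:

```javascript
// A cut-down sample of the ipdata response as raw text
var responseText = '{"country_name": "United Kingdom", "country_code": "GB", "is_eu": true}';

// JSON.parse turns the text into a plain javascript object
var geodata = JSON.parse(responseText);

// ...after which the fields are ordinary properties
console.log(geodata.country_code); // "GB"
```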

As mentioned, you get 1500 free API queries per day, so unless your site has a lot of traffic, this is pretty much fire and forget as far as the API goes. If your site does have more traffic, you might consider writing out a cookie with the visitor's country code when they first arrive, and checking for its presence before calling the API for the geodata, to save traffic.
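A rough sketch of that caching idea - check for a cached code before hitting the API. The cookie name visitorCC is my own invention, and the cookie string is passed in as an argument so the logic can be shown outside the browser (in the real thing you would pass document.cookie):

```javascript
// Return the cached country code if a visitorCC cookie is present,
// or null if we still need to make the geolocation API call.
// cookieString stands in for document.cookie.
function cachedCountry(cookieString) {
  var parts = cookieString.split(';');
  for (var i = 0; i < parts.length; i++) {
    var c = parts[i].trim();
    if (c.indexOf("visitorCC=") === 0) {
      return c.substring("visitorCC=".length);
    }
  }
  return null; // no cache - fall through to the API call
}
```

On the first visit you would write the cookie out after the API responds, and every later page view skips the API entirely.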


(By the way, $.getJSON('https://api.ipdata.co?api-key={your api key}', is not a typo - if you are looking for the closing bracket, it's down at the bottom of the script.)

You are probably wondering how we get the return from that?

Well, the function call on the next line, function(geodata){, collects the return from the API call and assigns the parsed JSON object to the variable geodata.



We dump that out to the console log for debugging purposes. Your browser will usually offer the ability to inspect the page and view the console log as part of its developer tools, but that is beyond the scope of this tutorial. If you need more info about this, Google is your friend :)

OK, so we assume at this point (and if you are an advanced user you might want to trap some errors and make no assumptions!) that geodata is now a parsed JSON object. This means we can use simple object notation to get at the info we need - ie: the country code.

And to do this we assume that geodata.country_code will contain the two-character international country code conforming to ISO 3166-1, and thus we can make a simple switch decision tree using:

 switch(geodata.country_code)

and provide a case for any countries we want to serve specific content for using:


case "GB":
        //country code was == to "GB"
        console.log("I say old chap!");
        // toggle this visibility of the GB specific content div
        UK.style.display="block";
        break;

for each country code. Where the case statement matches the country code, the console log simply records a suitable remark to identify the country.

The important part is UK.style.display="block";

This changes the visibility of one of the divs lower down to block, meaning it suddenly becomes visible, whereas all its counterparts remain hidden.


If none of the case statements find a match, then the default scenario is launched by making the div with the "other" id visible. This allows you to have a catchall greeting/block/advert in place for visitors from parts of the world you just didn't figure would pay your site a visit.
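Pulling those pieces together, the country-to-div mapping can be written as a small helper function (the ids "UK" and "other" come from the divs described above; the "FR"/"France" case is a made-up example of an extra country you might add):

```javascript
// Map an ISO 3166-1 alpha-2 country code to the id of the div to reveal.
function divIdForCountry(countryCode) {
  switch (countryCode) {
    case "GB":
      console.log("I say old chap!");
      return "UK";
    case "FR":
      console.log("Bonjour!");
      return "France";
    default:
      // No case matched: fall back to the catch-all div.
      console.log("Hello, visitor!");
      return "other";
  }
}

// Inside the $.getJSON callback you would then reveal the chosen div:
//   document.getElementById(divIdForCountry(geodata.country_code)).style.display = "block";
```

Structuring it as a function like this also makes the decision logic easy to test without a live API call.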

And that's about it as far as the JavaScript goes.

It all sounds complicated, but really isn't.

You can experiment with any of the key and value pairs, eg: currency and GBP, in the same way - but beware of the actual lat/lon info reported: I'm certainly not in London right now, though my ISP's core switch is. If you need finer-grained geolocation you can look at the HTML5 geolocation system, but be aware that, unlike the above script which is silent, asking for higher-resolution location info via HTML5 will bring up a permission prompt in the browser.

I hope this helps you get started with your location-specific greetings. I will leave you with https://www.omniglot.com/language/phrases/hello.htm and the request "do no evil". This script is not meant to discriminate, but to make sites more inclusive. I have travelled through many countries of the world now, and been welcomed in all. The sooner we realise we are all citizens of Earth, and that artificially created boundaries and differences are meaningless, the better for all.

Wednesday, September 18, 2013

A Nostradamus moment?

Just a couple of days back I wrote a blog post entitled MOOCs?, in which I also referred to a post from a couple of years ago, ICT is not CS!, where I discussed my worry that the current ICT curriculum in UK schools is no substitute for learning computer science. That now seems almost prophetic, as two directly related announcements cropped up on the web yesterday.

The first web announcement was spotted on develop-online.net, a website dedicated to the development of software, games and interactive media, which highlights the new government outline for the Computer Science course that is to replace ICT: http://www.develop-online.net/news/45378/UK-Govt-outlines-computer-science-curriculum. It very much echoes my thoughts and hopefully addresses the shortcomings of ICT as it stands at present. The new curriculum aims to deliver a structured introduction to computer science, starting at KS1 and adding new concepts as well as building upon principles through to KS4.


           "The government has outlined the new curriculum for computer science in schools.

A program has been drawn up for Key stages one through four, which will mean students will begin learning computer science from early primary school and then throughout their education.

The statutory guidance states the aim of the national curriculum for computing is to ensure students can understand and apply the fundamental principles and concepts of computer science, including abstraction, logic, algorithms and data representation.

Pupils will also be taught to analyse problems in computational terms with practical experience of writing computer programs to solve them, to evaluate and apply information technology analytically to solve problems, and also ensure pupils are responsible, competent, confident and creative users of such information and communication technology.

The publication stated that such a “high quality education equips pupils to use computational thinking and creativity to understand and change the world”.
The new GSCE subject in computing will replace ICT from September 2014."


 
Reading through the key stage subject contents listed, this certainly looks like an improvement on the existing ICT syllabuses, and I feel it will be welcomed by both pupils and some educators....

....But there lies one of the problems...

With a course of this type, the educator - the person delivering the material - will really make or break the learning experience that children have at all levels. The GCSE Computer Science syllabus really needs someone a little more "geeky" to deliver it well: someone who has a good understanding of the concepts to be delivered and an active interest in the area, as well as general teaching skills and an enthusiasm for the subject that can be passed on to pupils. I have no doubt that some - possibly many - existing teachers of ICT will be more than capable of meeting the demands... but I am equally sure that many may struggle. Which brings me round to my point: how can these complex subjects be brought home to pupils when the teachers may not be fully up to speed on the subject themselves? MOOCs may be the answer - high-quality interactive online learning material produced by the examining entities themselves.

...And this isn't perhaps out of the question...


The second web announcement yesterday was that Cambridge University Press's GCSE Computing MOOC goes live at the end of the month (September 30th). This is an OCR-accredited GCSE in Computing course that is available and supported online right now, based around learning using the Raspberry Pi platform. http://www.cambridgegcsecomputing.org/about-us

             "It's a GCSE, but not as you know it... Computing rules the world, or at least a large part of it. Cambridge GCSE Computing Online will provide free and open access to OCR’s GCSE in Computing, supported by resources from the Raspberry Pi Foundation and Cambridge University Press. Together we’re busy creating a ground-breaking site to help you make sense of the technologies and opportunities this amazing vehicle offers in industry, education and every aspect of our daily lives. Through a mixture of videos, animations and interactive exercises, the content is being designed to challenge and inspire you. We know that studying Computing is about using creativity and problem-solving to unlock opportunities all around you, inside the classroom and far beyond it."



This will be freely available (free as in beer, to use the Linux analogy) to schools to help deliver the qualification, and will provide a valuable resource. More crucially, it is also available to students already in education who are worried that they have been disadvantaged by the current ICT curriculum. But they would have to make their own arrangements to sit the exam - and it is important to stress that:

             "No OCR GCSE Computing certificate will be available direct through the Cambridge GCSE Computing Online website but the content will help students prepare for the exam." 

So individuals interested in this qualification would need to see whether a local school or college was able to host the final examination. I'm sure that when this takes off a number of local groups will emerge, and due to the Raspberry Pi involvement it will be a hot topic in local hackerspaces and maker groups, attracting interest from enthusiasts - so finding enough people in a geographic area to make it worthwhile putting on an exam, or travelling to where one is running, should not be out of the question.


In my opinion this has been a looooong time coming, and will be too late for some, but at least now there is light at the end of the tunnel for those students who are frustrated in their efforts to learn "Computing" rather than having "ICT" thrust upon them.


Sunday, September 15, 2013

MOOCs?

Nearly two years back I wrote a blog post entitled ICT is not CS!, in which I wrote about some of the reasons why I feel that the UK (and some other countries too) are falling behind in computer science education. The article caused discussion in some circles, and has indirectly led to several projects I have heard about, where others with similar worries have stepped in to run initiatives with schools in their own areas. Thanks to everyone who gave feedback, and especially to those who went on to do something positive.

Thankfully the situation in the UK is improving now, despite resistance and scepticism from some. A number of initiatives that have featured in the national press seem to be slowly filtering into place. There is also a growing number of technology education events throughout the country now, though some of them take a little finding unless you follow the right news feeds on Twitter and Facebook. But they are out there!

Sadly, though, this is still not true throughout the world. There are still so many places where science and technology education is scarce or non-existent... But thanks to the widespread availability of the internet, this may not always be the case. Even where there is no local source of science and technology education, there has for a year or so now been a wide range of online education systems developed by some of the world's leading universities, delivered and graded for free simply to benefit the world of education. I refer of course to MOOCs (Massive Open Online Courses).

Today I read this: http://mobile.nytimes.com/2013/09/15/magazine/the-boy-genius-of-ulan-bator.html?ref=magazine&_r=0& which is what has prompted this post... (credit to adafruit for the original tweet which drew it to my attention)

It describes the achievements of Battushig Myanganbayar from Mongolia, a country where "a third of the population is nomadic, living in round white felt tents called gers on the vast steppe". At the age of 15 he "became one of 340 students out of 150,000 to earn a perfect score in Circuits and Electronics, a sophomore-level class at M.I.T. and the first Massive Open Online Course, or MOOC — a college course filmed and broadcast free or nearly free to anyone with an Internet connection — offered by the university." The article also goes on to describe how "Battushig’s success also showed that schools could use MOOCs to find exceptional students all over the globe. After the course, Kim and Zurgaanjin suggested that Battushig apply to M.I.T., and he has just started his freshman year — one of 88 international students in a freshman class of 1,116. Stuart Schmill, the dean of admissions, said Battushig’s perfect score proved that he could handle the work."

There can therefore be little doubt about the value of these courses, not only in developing countries but the world over.

This has, however, raised some interesting questions about the future of education in some circles... Does this spell the beginning of the end of traditional education, as some have suggested? Is it a "fad"? Somehow I doubt that... We have had the Open University here in the UK for a number of years (albeit not for free), yet we still see record attendance figures at our colleges and universities. I feel I should add that had it not been for the OU TV programmes on the BBC back in the 70s and 80s, I almost certainly would not have the interest in computing, science and technology I have now. I tend to view these MOOCs in the same way - as a source of inspiration for the scientists and engineers of tomorrow.

A handful of UK colleges and universities are now offering a limited range of MOOCs, but I think it's time a lot more looked towards producing their own, not only as a public service but as a way of finding the brightest and best students who might otherwise never find their way to their doors. And, more importantly, aiming them not only at school leavers but at those still in secondary education too... especially if those courses had national accreditation comparable to the courses currently offered in the national curriculum.

It really is time everyone started making education cheaper, more accessible, and more meaningful to the future aspirations of our children - MOOCs may well be the means to do that.

Links:

http://en.wikipedia.org/wiki/Massive_open_online_course

http://www.openculture.com/free_certificate_courses


http://www.bdpa-detroit.org/portal/index.php?Itemid=20&catid=29:education&id=57:moocs-top-10-sites-for-free-education-with-elite-universities&option=com_content&view=article

http://ocw.mit.edu/index.htm


Monday, September 02, 2013

Time to look at a balloon project again?


Just a quick blog post from my phone, more as a "note to self" as a future project than a full write up, but perhaps it may also serve as a starting point for others who follow my blog to begin their own research...

I've looked at and dismissed a high altitude balloon project several times in the past. There have been a number of reasons, not least of which has been the cost involved - the latex balloon, the large volume of helium required to fill it, and the risk of losing an expensive payload all make for an expensive flight.
However, a few news posts and blog entries on the web have recently found their way to my attention, suggesting that there is a lower-cost entry route to this fascinating area of research.

I refer of course to "pico balloon projects".

Unlike the more frequently reported "high altitude" flights, which carry complex payloads (usually equipped with cameras) to "near space" heights of around 30km using a meteorological balloon and parachute recovery system, pico balloon projects carry a very lightweight payload - typically just a few tens of grams - lifted using inexpensive foil "party" balloons (£3.95!!). They reach a maximum altitude of just a few km, but under ideal conditions can make lengthy flights.

The weight restriction on the payload is governed by the very limited lift available from the balloon itself, but this should not be viewed too negatively: with a little ingenuity a lot of technology can be packed into a 50g mass. It also means that the overall cost of the payload is lower - an advantage if recovery is uncertain.
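To get a rough feel for that restriction, here is a back-of-the-envelope sketch (my own approximation, not taken from any of the guides linked below): at sea level, helium provides roughly 1 gram of gross lift per litre (air at about 1.2 g/L minus helium at about 0.18 g/L), and the free lift available for ascent is what remains after subtracting the envelope and payload masses.

```javascript
// Approximate free lift of a helium-filled balloon at sea level.
// LIFT_PER_LITRE is a rounded figure; real lift varies with
// temperature, pressure and gas purity.
function freeLiftGrams(volumeLitres, envelopeGrams, payloadGrams) {
  var LIFT_PER_LITRE = 1.0; // grams of gross lift per litre of helium
  return volumeLitres * LIFT_PER_LITRE - envelopeGrams - payloadGrams;
}

// e.g. a ~100 litre foil balloon weighing 30g with a 50g payload:
console.log(freeLiftGrams(100, 30, 50)); // 20 grams of free lift
```

The figures in the example are illustrative only, but they show why pico payloads have to be pared down to a few tens of grams.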

From the little research I've done, it seems there are some "off the shelf" boards available (PCB designs and firmware are open source). Personally, I would probably want to develop my own flight computer - probably using an Arduino as the test platform, then building a cut-down version using just the required components (which looks to be the way others have gone too).
Below are a few quick links to get you started...

edit (I forgot this one!) : http://amsat-uk.org/2013/09/01/long-duration-434-5-mhz-balloon-launched/

http://ava.upuaut.net/?p=448

http://ava.upuaut.net/?page_id=199 (hardware)

http://ukhas.org.uk/general:ukhasbadgeboard

http://www.randomengineering.co.uk/  (balloons)

http://ukhas.org.uk/projects:microballoons:faq

http://ukhas.org.uk/guides:filling_foil_balloons

http://www.rtl-sdr.com/hackrf-decoding-pico-high-altitude-balloons-hab/

http://ukhas.org.uk/projects:splat

http://ukhas.org.uk/general:beginners_guide_to_high_altitude_ballooning  (general info - not pico specific)