Getting around CORS with Node.js

I recently wrote a JavaScript web application that depended on AJAX and a Java-driven back end. The back end was a standard RESTful web service running on a Glassfish server. Before I started working on the application, a designer was hired to produce the desired HTML/CSS layout. As I got further into development and the moving parts started to change, the CSS also needed to change. This posed a problem for us, mainly because all of the data (and the HTML elements needed to display it) used to populate each page had to be requested from the web service. Typically I would log in to the web application running on the test server, fire up Firebug, and use the HTML/CSS edit feature to get the CSS where I needed it to be. Alternatively, I would save the website, open the HTML and CSS pages locally, and edit from there. Any CSS changes would then get committed to the repo. However, this wasn't good enough for the designer, who wanted to check out the repo and simply run the application on their local server. This, of course, caused a CORS error, because an XHR request could not be made from localhost to a remote server.

To solve this problem, the designer could have set up a local copy of the back end as well. However, that would require MySQL, Eclipse, and Glassfish to be configured correctly. Not really ideal. So I went out in search of another solution. After talking to some people, I decided to use Node.js. Not having worked with Node before, I wasn't quite sure what I was going to do. Some people said I needed a proxy, some said I needed middleware, but in the end all I needed was a simple server and a client.

The Structure

The idea is for a server to listen to the requests coming from your localhost on a particular port. This means the URL of the XHR requests has to be changed to localhost:portnumber. Once a request is captured, a dummy client makes the exact same request, but instead of the client being your localhost, it is a Node client whose domain is the same as the back end's. Create the client using the domain of the back end. I never found any documentation for what the createClient function accepts, so let me show you what I used:

var http = require('http');
var aclient = http.createClient(80, 'backendurl.com');

The next step is to create a server that listens on a particular port, the same port I mentioned above. Ensure that this port is not being used by another application on your computer, otherwise you will get a strange port-not-available exception. Now, in the function you pass in when you create the server, you need to capture the request, pull out the needed data, and make a similar request with a new URL. You also need to check what the request method is. This is important because sometimes an OPTIONS request will be sent, which is basically the browser's way of testing whether CORS is allowed.

if (req.method === 'OPTIONS') {
	// add needed headers
	var headers = {};
	headers["Access-Control-Allow-Origin"] = "*";
	headers["Access-Control-Allow-Methods"] = "POST, GET, PUT, DELETE, OPTIONS";
	headers["Access-Control-Allow-Credentials"] = true;
	headers["Access-Control-Max-Age"] = '86400'; // 24 hours
	headers["Access-Control-Allow-Headers"] = "X-Requested-With, Access-Control-Allow-Origin, X-HTTP-Method-Override, Content-Type, Authorization, Accept";
	// respond to the request
	res.writeHead(200, headers);
	res.end();
} else if (req.method === 'GET') { // no data is coming
	// use the client you created to make a request; it needs all of the
	// information captured in the GET request coming from your localhost:portnumber
	var clientrequest = aclient.request(req.method, '/api' + req.url, {
		'host': 'backendurl.com',
		'authorization': req.headers['authorization'],
		'content-type': 'application/json',
		'connection': 'keep-alive',
	});
	clientrequest.end();
	var msg = "", clientheaders;
	// get the response from the back end
	clientrequest.on('response', function (clientresponse) {
		clientheaders = clientresponse.headers;
		clientresponse.on('data', function (chunk) {
			msg += chunk;
		});
	});
	setTimeout(function () {
		// send the data you just received from the back end back to your
		// client application on localhost
		res.writeHead(200, clientheaders);
		res.write(msg);
		res.end();
	}, 500); // wait a bit just in case we don't have all of the chunks of data
}
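
For completeness, here is roughly how the handler gets wired into a server, plus a slightly safer way to flush the response. Treat this as a sketch: port 8080 is an arbitrary choice of mine, and listening for the response's 'end' event is an alternative to the setTimeout hack above.

var http = require('http');

// dummy client whose domain matches the back end
var aclient = http.createClient(80, 'backendurl.com');

http.createServer(function (req, res) {
	// the OPTIONS handling shown above goes here; for the GET branch,
	// waiting for the 'end' event guarantees all chunks have arrived
	var clientrequest = aclient.request(req.method, '/api' + req.url, {
		'host': 'backendurl.com',
		'authorization': req.headers['authorization'],
		'content-type': 'application/json'
	});
	clientrequest.end();
	clientrequest.on('response', function (clientresponse) {
		var msg = '';
		clientresponse.on('data', function (chunk) {
			msg += chunk;
		});
		clientresponse.on('end', function () {
			// all chunks received, safe to reply to the local app
			res.writeHead(clientresponse.statusCode, clientresponse.headers);
			res.write(msg);
			res.end();
		});
	});
}).listen(8080); // your XHR URLs would then point at http://localhost:8080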
This is a simple implementation and it works great. It might not be the perfect solution, but it gets the job done. Feel free to contact me if you need to implement this type of solution. Also, if you need help, go ahead and join the Node.js IRC channel: #node.js on irc.freenode.net. The people there are really helpful and forgiving.

Buttercamp – New York

I just got back from New York City and I am happy to announce that Buttercamp was a success! Buttercamp took place at the ITP labs of New York University's Tisch School of the Arts. It was a hack session sponsored by the WebMadeMovies project. The idea behind it was simple: make cool HTML5 demos using popcorn.js and butter.js (Butterapp, the Popcorn.js authoring tool) and any other tools you could find. In preparation for the day, Brett Gaylor and Ben Moskowitz (who did an awesome job organizing, by the way) reached out to artists, filmmakers, and designers. Ben also had some of his students attend. Anyone interested in participating simply had to fill out a form proposing their idea or project. The requirements were simple: you had to have an HTML5 video, a story to tell, and a developer who knew their way around the web. Each project group was assigned a popcorn.js and butter.js expert to help with the JavaScript part of the demo. The process was flawless: the filmmaker/artist explained their vision and started annotating their video, the team developer worked on the look and feel, and the popcorn.js/butter.js expert started on the functionality. Watch the Video Blog.

The Groups

#18daysinegypt

You can read more about the project on their website. The inspiration for their Buttercamp demo came from the current conflict in Egypt. The idea was to produce a non-linear timeline. The main video, positioned to take over the entire screen, was of a protest happening on a bridge. As the video played, information about the location appeared on screen: Wikipedia articles, close-up photos of the protesters, even videos of protesters being interviewed. The main challenge for this demo was finding related content: who was on this bridge tweeting, posting photos to Flickr, and so on, as the protest was happening? The main video showed one angle of the protest, but the extra data formed a bigger, though still incomplete, picture. The question remains: how does one go about getting the whole story from every angle? Demo links here!

through a lens darkly

You can read more about the project on their website. The team wanted to showcase the work of Sylvia Isabe using butter.js. Since this project has a lot of material, it is fair to say they wanted a deeper understanding of how the tools work so they could apply that knowledge to future work. The team's tinkering led to an improvement to the butter.js tool: an import/export feature! It is still in review, but the idea is to be able to import work previously done with the tool in order to make changes and add content. Demo links coming soon!

everything is a remix

This project aims at revealing how a particular video came to be: which resources were used in its making and how the content was "remixed". This was more of a proof of concept than an actual demo request. Kirby Ferguson and I worked on this. Kirby wanted to explore an interface that jumped down rabbit holes for more stuff to watch and learn from, kind of like Jonathan's Donald Duck demo, except that instead of having information around the video, it would give the user a clear way to see the original clip. Kirby came up with a simple wire-frame:

When the main video reached a point where an original source video was available, a button would appear. In this case we had two source videos. When the user clicked the button, a new video would open up on top of the original with extra content (in this case an Amazon link). As a result, we realized that we needed a video plugin in popcorn.js, which I took some time at the beginning of the day to develop. It is currently making its way through the review process. The main challenge of this demo was CSS. Positioning the second video on top of the first proved oddly challenging and in the end just did not work. Temporarily, you can view the demo here. The idea that Kirby wanted to explore is possible, but it really needs a designer to make it work.
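
To give a sense of the mechanics, here is a rough vanilla JavaScript sketch of the idea. The element ids and the cue time are placeholders of mine; the actual demo was built with popcorn.js rather than raw DOM calls.

var mainVideo = document.getElementById('main-video');
var sourceButton = document.getElementById('source-clip-button');
var overlay = document.getElementById('source-clip-overlay');
var SOURCE_CLIP_TIME = 42; // seconds into the main video, made up for illustration

// reveal the button once playback crosses the point where a source clip exists
mainVideo.addEventListener('timeupdate', function () {
	if (mainVideo.currentTime >= SOURCE_CLIP_TIME) {
		sourceButton.style.display = 'block';
	}
}, false);

// clicking the button pauses the main video and opens the source clip on top
sourceButton.addEventListener('click', function () {
	mainVideo.pause();
	overlay.style.display = 'block';
}, false);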

robots

The idea of this Buttercamp demo was to provide a non-linear type of storytelling: the user made their own experience by choosing a path to explore. Bobby, a new member of the WebMadeMovies/Mozilla team, did an awesome job fine-tuning the demo. View it here (Firefox only for now).

Graffiti Markup Language (GML)

The GML project has been around for some time; you can read about it on their website. This demo aimed at connecting graffiti with video, and a GML popcorn.js plugin is already in the works. The demo can be viewed here (it is most likely being tweaked as you read this), and a similar but different demo can be viewed here.

Tubeyloops

Tubeyloops' focus was actually remixing video. You can read the project proposal. Greg Dorsainville had a vision of having multiple videos and allowing the user to remix them on the fly in order to produce a finished product. What ended up happening here was AWESOME, and hopefully in time I can link you to a blog post explaining more. Data from pattern sketch was used to alter the video's audio and to produce sequences of the final product. How it worked: there were four video clips, each linked to a key on the keyboard (Q, W, E, R). When one of the keys was pressed, the corresponding video played until another key was pressed. A truly unique remix was formed each time the demo was used.
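
The core trick can be sketched in a few lines of JavaScript. The element ids here are hypothetical; this illustrates the technique, not the demo's actual code.

// map the Q/W/E/R keys to four <video> elements
var clips = {
	Q: document.getElementById('clip-q'),
	W: document.getElementById('clip-w'),
	E: document.getElementById('clip-e'),
	R: document.getElementById('clip-r')
};

document.addEventListener('keydown', function (e) {
	var next = clips[String.fromCharCode(e.keyCode)];
	if (!next) {
		return;
	}
	// stop whatever is playing, then let the chosen clip take over
	for (var key in clips) {
		clips[key].pause();
	}
	next.currentTime = 0;
	next.play();
}, false);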

Individuals

There were a lot of people there, including Ben's students, who wanted to learn more about HTML5 video and popcorn.js. We dedicated about an hour to them, providing an overview of what HTML5 is and what you can do with it, along with an overall tutorial on using popcorn.js and butter.js. As a result of this group using butter.js, a number of bugs have been filed to improve Butterapp, the Popcorn.js authoring tool.

Lessons Learned

The day went great, participation was through the roof, and the demos were mind-blowing. However, as with most things in this world, Buttercamp can use some improvements.

  • The day was way too long. We started at 9:30 am and finished at 10 pm. I would say we started seeing people leave around 4 pm. A little less than half of the people stayed for the show-and-tell at the end.
  • More designers were needed. A lot of the demos were centered around design. I already mentioned CSS being the only thing blocking my demo from doing what it was supposed to do. On days like this, design experts are needed to fine-tune each demo once all of the content has been collected.
  • A server to host all of the demos. It would have been nice to let people FTP their demos as they worked on them. It definitely would have made this blog post better, and it would also eliminate the time it will now take to collect all of the demos from each team.

Conclusion

Buttercamp was fantastic. If you missed it, maybe you can start a petition to get Buttercamp going in your town. The day went smoothly and we encountered no real problems. Everyone had a great time collaborating and sharing ideas. If you attended Buttercamp, please share your stories, pictures, and results.


Code Review, SR+ … but why?

I wanted to take some time to talk about code review. Let me start off by explaining what "code review" is, or rather what it means in the context of this post. Code review is the act of looking at someone's code in order to evaluate it. Code that is in review is often referred to as a patch. The purpose of the code is to fix a bug, add functionality, or improve performance. Once the patch passes review, it is staged/added to the core of the project. The reason behind the review is simple: does the code do what it says it is supposed to do? It is important to note that every project has review requirements. For example, the popcorn-js project I am working on has the following requirements:

  1. Ensure the code follows the style guide and is appropriate
  2. Ensure the code passes lint
  3. Ensure main tests pass on multiple browsers: test/index.html
  4. Ensure new tests were added to test the new functionality and that they pass
  5. Ensure that other tests such as parser or plugin tests that are affected by the new code also pass

Looking at these requirements, a patch for the popcorn-js project has to fix or add the functionality it is meant to fix or add, it has to include tests, and it has to follow a style guide. If the patch you are looking at is missing any of these, the review simply fails. However, what happens when it passes? Lately I have been seeing short and sweet review comments: "Super Review (SR)+". But what does this mean exactly? Did you follow the review requirements? Do you even know they exist?

When a patch fails review, the reviewer always states the reason for the failure. This is obvious, since the problem has to be outlined before it can be fixed. Is it too much to expect the same courtesy for a passing review? After all, the way a patch was tested is significant. I am not saying that the person reviewing the patch is not to be trusted. I am, however, pointing out that there is merit behind doing reviews. If a review is not properly documented, it will be unofficially re-reviewed by the person responsible for staging the patch. Why? Simply because the person staging the new code wants to ensure that nothing broke in the process. I am aware that the person staging usually checks that nothing is broken, but there is a major difference here. Looking back at the popcorn-js project and its review requirements, you will notice that the project has core unit tests as well as parser and plugin tests. Typically, after something has been staged, the core unit tests, including any main demos, would be run. The plugin and parser tests, however, would not. From a release engineer's perspective, proper review documentation saves a lot of time. Let me provide an example of good review documentation based on popcorn-js' requirements:

SR+, code looks good

No lint errors

Unit tests passing on (Vista) Firefox, Chrome and Safari

This patch affects the googleMap plugin. I verified that all unit tests/demos using this plugin work as expected on the browsers mentioned above.

Notice that I am not writing a whole paragraph. Point-form notes are all you really need to let the appropriate people know what you did and why the review passed. I hope you keep this in mind when doing a review.


Building on Fedora 13

Prerequisites:

Run the following commands in a terminal. Note that you need root privileges: either prefix each command with sudo (as shown) or switch to root first with su root.

There will be some prompts, so pay attention and answer y to everything.

sudo yum groupinstall 'Development Tools' 'Development Libraries' 'GNOME Software Development'
sudo yum install mercurial autoconf213 glibc-static libstdc++-static yasm wireless-tools-devel mesa-libGL-devel

Building

1. Get the source code

hg clone http://hg.mozilla.org/mozilla-central/ src
cd src

2. Make a .mozconfig file and put it into the directory (src) you made above. The file should contain:

. $topsrcdir/browser/config/mozconfig
mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/objdir-ff-debug
mk_add_options MOZ_MAKE_FLAGS="-j2"
ac_add_options --enable-debug
ac_add_options --disable-optimize

Note: these options are for a debug build.

3. Run the makefile

make -f client.mk

4. Run Firefox by going into the src/objdir-ff-debug/dist/bin directory and either double-clicking the icon or running ./firefox from the command line.


Seneca at polytechnicscanada.ca

Polytechnics Canada is a national alliance of nine institutions dedicated to helping colleges and industry create high-quality jobs for the future. The members are Seneca College, Humber College, Sheridan College, Olds College, George Brown College, Conestoga College, British Columbia Institute of Technology, Algonquin College, and SAIT Polytechnic. The idea is to work closely with companies to promote innovation and practical applied research. Being a graduate of, and now working for, Seneca College, I have the opportunity to work with industry partners. Seneca's Centre for Development of Open Technology is currently working with Mozilla, NexJ, Aerius 3D, and Fedora. Out of all the research projects at Seneca, popcorn.js was chosen to be presented at the 2010 Polytechnics Canada meetup. Each member was to choose one project that would be presented by a student in a five-minute PowerPoint presentation. Yes, PowerPoint. Five minutes, five slides. I am not sure how that promotes innovation, but I guess it allows for a controlled environment with flawless transitions from one presentation to the next. Being part of the popcorn.js team, I volunteered to present. For those of you who do not know what popcorn.js is, you have to read my blog more often. I joke. Popcorn.js is a JavaScript library aimed at allowing non-technical people to manipulate open video on the web. It provides a way for filmmakers to control the environment in which their video is viewed by integrating semantic content (wiki articles, Google Maps, Google News, web pages, Twitter, Flickr) with HTML5 video. I created a demo specifically aimed at explaining what Popcorn can do for this presentation. View it here. Some of the other student presentations covered new-concept windmills, water/oil filtration units, and high-tech lenses.

After the five-minute presentation came the "elevator pitch" to the Minister of State for Science and Technology, Gary Goodyear. We were supposed to pitch our projects. I was prepared, or at least I thought I was. I had it all figured out: I was going to start with "Imagine CP24 on the web", a simple statement that I thought would get his attention. After all, I had heard a lot of people refer to CP24 while explaining Popcorn, and it really is a close comparison. However, during dinner I was sitting beside a member of NSERC, and as soon as he realized that I was to do a pitch he said "pitch me". So I went for it, and got blown out of the water. "What's CP24?" he asked. He went on to say that he and the Minister would be interested in knowing how popcorn.js impacts Canada's economic growth. Well? How does it? I mumbled something about revolutionizing the web and bringing Canadian developers to the top of Internet technology. But really, popcorn.js is an open-source project that will not make or save a particular company millions of dollars. However, it has the potential to blow up. I mean, popcorn.js has already been featured in Wired's Webmonkey and CHIP, an online magazine. But how does one put a price tag on something that is free to use? Anyone? Please comment if you have any ideas, as I am sure this question will come up again.


Seneca at technicity.ca

On Nov 30, 2010, Seneca was invited to showcase its IT work at technicity.ca. The Technicity event's goals were to acknowledge the talent we have in Toronto and bring forth a plan of action to foster that talent. David Crow talked about how people tend to go to California or Boston to look for talent even though Toronto has some of the best schools. Seneca was fortunate enough to get invited to the event. I, along with Scott Downe and Andor Salga, got a chance to showcase our work, including processing-js, popcorn-js, pointstream, and c3dl. I must say that the crowd was amazing. We got a lot of positive feedback, including "this is a jewel… don't give it away, sell it" about Popcorn. Usually we are faced with people who do not have a good understanding of technology and its capabilities. At Technicity, however, we were faced with people who did not understand why, or maybe how, people can make money doing open-source development.


Processing-js 1.0 Release

It has been a while since I have written anything about processing-js. That is not to say nothing has been happening. In fact, the Processing.js team and I have been working hard for the past couple of months to land the 1.0 release. In this release we focused not only on adding functionality but also on documentation. We were getting a lot of comments about the lack of documentation, so we wanted to put a stop to that. After all, everyone expects a 1.0 release to be stable and usable by everyone, not only experts. We have also spiced up the website! It now showcases a sketch on the homepage, a couple more exhibition examples, a fully updated reference page, and a better learning page. If you are curious about the features we added, read the change log.
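
If you have never tried it, running a sketch in the browser boils down to something like this. Consider it a minimal sketch: the canvas id and the Processing code are placeholders of mine.

// grab a <canvas id="sketch"> element already on the page
var canvas = document.getElementById('sketch');

// Processing.js accepts Processing source code as a string
var code = 'void setup() { size(200, 200); }' +
           'void draw() { background(0); ellipse(mouseX, mouseY, 20, 20); }';

// compile the Processing code and run it against the canvas
var sketch = new Processing(canvas, code);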

You have your own processing sketches?

As I mentioned, we are featuring a processing-js sketch on the home page. If you have a sketch that you want people to see, or if you have a project that should be on the exhibition page, please let us know. How? You can always file a ticket on Lighthouse, comment on this post, or tweet about it, adding @annasob to get my attention.

Need help with processing-js?

If you don't feel like you have a grasp of processing-js, or your sketches are simply not working, feel free to join IRC (just enter a nick and press go) or the processing-js Google discussion group.

Feel free to read our Quick Start guides for JavaScript developers or Processing developers; they should help you out a lot.

Mike "Pomax" Kamermans has also written a ton to get people started; check out his guide.

We’re not done yet!

Just because 1.0 is done does not mean the project is complete. Our large community is always finding ways to optimize the code. For a look at what's coming in the 1.1 release, see the Lighthouse milestone.
