Showing posts with label web. Show all posts

Friday, August 28, 2009

Premium Plaxo for Comcast users …

I recently switched from DSL (which I'd had since it was first available) to Comcast cable for my Internet connection (and TV and phone). By doing so I saved about a hundred bucks a month over AT&T and DirecTV. Of course, as soon as I switched, AT&T started calling me with a bundle at roughly the same price, but that's a different story.

One of the things that happened a while back was that Plaxo was bought by Comcast. I have always been a premium Plaxo user, feeling that I wanted to support them since I find the product so incredibly useful. What I learned was that if you are a Comcast subscriber, you are automatically a Plaxo premium user.

Now, being a premium subscriber used to mean only that you got VIP support and access to a couple of tools (like the address and calendar deduplication tool). But now Plaxo has announced that the Outlook sync is a premium-member-only tool. While I worry that this decreases the value of the service (fewer reasons for people to sign up means fewer members, and fewer automatic updates for me), what is interesting is that every Comcast subscriber gets access to these premium services.


Read more...

Wednesday, August 19, 2009

GTUG Campout - doin’ the Wave …

I recently attended the Google Technology User Group Campout at the Googleplex in Mountain View. This was a three-day sprint to build something interesting with the latest Google product: Google Wave.

Google Wave, as it turns out, is a very interesting experiment in social interaction. Google is trying to reinvent collaborative communication with a piece of software that is one part chat, one part wiki, and one part WebEx.

I'd seen this product at the Google I/O conference a few months back and was impressed with the demos. Basically you get these shared documents (called Waves) that all of the collaborators can update at the same time. You can watch the hour and a half demo at http://www.youtube.com/watch?v=v_UyVmITiYQ

The demo included things like interaction with blogs, Twitter, and other web technologies, as well as interesting programming feats like on-the-fly grammar checking. I signed up for a sandbox account the day of the presentation (using my iPhone, of course), and got set up a week or so after that.

Wave was written by the brothers Lars and Jens Rasmussen, the architects of the Google Maps API. In some sense, this is an experiment in building software informed by the lessons they learned with the immensely popular Maps API. By giving developers access early in the build process, they hope to build a more solid platform that serves developers' needs.


Read more...

Tuesday, August 11, 2009

Cloudy with a chance of Apps …

Since last week, I've been immersed in coding and learning about various cloud applications.
Google Wave

First there were a couple of meetups about Google Wave that gave me an overview of some of the capabilities and requirements for developing applications around the product. Google Wave is an interesting piece of social media that is a bit like chat and MediaWiki combined with WebEx.

The first talk, on Monday, was about the federation server, which is the open source implementation of Google Wave. The idea is that you could have a Wave server inside your firewall to protect your data, while still allowing communication and interaction with other federated servers. The code is so new that it is actually using a different protocol than the Google Wave servers themselves.
Read more...

Tuesday, July 21, 2009

Hey you get offa my cloud …

I've been using some of the more interesting "cloud" applications recently: Google Apps, Live Mesh and a few others.

I'm really impressed with the capabilities of these free web applications. It's a really interesting marketing tool as well: give away the low-end product to build user acceptance, and then add a bit more to give value to the enterprise.

My first foray into the personal cloud was Google Docs. This product has to be the coolest idea ever: create your documents on a web site, and let them be shared and simultaneously editable. The concept is awesome, and works really well for some documents (most notably spreadsheets). I can share a spreadsheet with any number of people, and they can all edit it at the same time.


Read more...

Tuesday, May 5, 2009

Oracle and Sun - Better Microsoft competition?

I was thinking about this as I drove to work this morning: what is the real business value to Oracle of buying Sun?

It occurred to me that among the many benefits to Oracle are the products that help it compete better with the Microsoft offerings. Could this be another in a long line of acquisitions by Larry Ellison in his quest to make Oracle a more successful company than Microsoft?
Read more...

Wednesday, April 22, 2009

How to delete your Google account

If you're like me, you eventually end up with too many identities. I haven't figured out how to associate different email addresses with a single account, so I have to maintain several different Google identities.

For example, a client invites me to join his Google group using his company email, so I have to set up a new account to access and manage that group. At some point I try to consolidate these under my Gmail identity, but that isn't always possible (the client may, for instance, have restricted the group to addresses on their corporate domain).

Eventually, though, I do end up with accounts that I'm no longer using (and sometimes I'd just like a fresh start). Google actually has a link in their help about deleting an account, but it took me a bit to find it, so I thought I'd post a step-by-step guide.
Read more...

Tuesday, April 21, 2009

Getting a Google login for your existing email address

Step 1 - Simulating an email alias using a Google Group


The last couple of days, I've been involved in setting up a new non-profit entity and email addresses for the people involved. I used my hosting service to create a new domain, and set up email addresses that forward to their existing email addresses.

What I realized was that I didn't have an easy way to set up email aliases, and I needed a way to forward email to the entire group.

Naturally I thought of Google Groups, which lets you set up a sort of discussion board and file sharing area. My team has been using groups for other purposes, and one of the things we learned was that if you keep the group private, it can become confusing as people add their email aliases to the member list so that they can post from their various accounts.

So for this new group, I decided to set it up to only use the email addresses from the new entity, and allow posting to the group by any email address. By doing this, anybody can send an email to groupname@googlegroups.com and it will get sent to the group, effectively creating an alias.



This also keeps the group relatively clean, since the only addresses that show up in the members list are the ones connected to the new domain. On other groups I've managed, people end up with lots of different entries because they have multiple email addresses, which can be confusing.

Anybody can now send email to "all" with a single address, without signing in. Of course that doesn't take advantage of the file sharing and other group features, so what follows are instructions on how to create a login for a new email address which will allow logging into the group.
Read more...

Sunday, March 15, 2009

Stupid Firebug tricks …

Recently I've been doing some web page work again, and trying to push the envelope on my CSS knowledge.

I was at a talk at 360|iDev where Brian Fling was talking about UI design, and he pointed out that your site should work without your CSS. Now while I know this as a design principle, it seems like something that we probably overlook more often than not.

So, out of curiosity, I thought I'd take a look and see what some of my pages look like without their stylesheets applied.


Read more...

Sunday, February 15, 2009

Share A Calendar with a Group …

One of the things I always struggle with as a project manager is how to communicate availability. If I'm using a corporate email system like Exchange, it is extremely easy to share calendars, and everybody on that system can at least see your free/busy status, which helps in setting up meetings. But when you're dealing with a disparate group, who don't have access to the same information, figuring out meetings can be difficult. I manage some of this complexity with tools like Plaxo and MobileMe, which keep calendars in sync across a variety of calendar systems, including Google, Yahoo, and even the local calendar application.

This doesn't solve the problem of how to check on availability however. What I've always found most effective inside the corporate firewall is to make my calendar public, and to ask my team members to share their calendars as well. This allows me not only to quickly schedule meetings, but gives me insight into what sorts of meetings my team is scheduling, and how they are managing their time.
Read more...

Monday, January 26, 2009

Google Translate me please …

I use Google Reader to follow industry blogs about things like PHP and Java. One of the nice things Google Reader does is automagically translate a post into English when it's in a different language.

This is very helpful with blogs on subjects like these, since the international community is very active. Reader will give you a brief translated version of the feed, and when you click the link to go to the page, it typically forwards you through http://translate.google.com so you can read the page. For the most part, this yields a very understandable page that represents what the author was trying to convey.


Read more...

Friday, December 26, 2008

EJB or not JB? That is the question (sort of) …

I recently read a post on LinkedIn on the WAFUG group by Andrew Hedges:
Frameworks or libraries?

"Frameworks are larger abstractions than libraries. Abstractions leak, cost performance and take up mental resources." ... http://tr.im/20fm


Is the whole framework craze overkill for most projects? At what point does it make sense to use a framework over libraries that just do what you minimally need? Is it better to start with a large, leaky abstraction or only impose it if/when the project gets big enough to need it?

The answer to this for me is "it depends", and it's what keeps architects up at night. Frameworks are usually attempts to encapsulate some best practice or design patterns in a way that will help the developer achieve some gain in productivity, scalability or reliability.

During the dotBomb era, I worked for a small boutique consulting company designing web sites for a lot of startups. All of them were convinced they would be the next big thing, and most of them had extremely aggressive targets for scalability.

Because of this, we spent quite a bit of time with various frameworks trying to understand the balance between the complexity introduced and the scalability that the framework would give us. At the time, Sun was busy pushing the EJB framework, which was a study in how to over-engineer software. The big benefit promised by EJB was that you could theoretically scale your application to infinity and beyond. The downside was that it basically took a huge team of rocket scientists to get off the ground (this was not the nice simple POJO based EJB of today).

What we found was that in most cases, we could get the same sort of scalability for our clients out of simple Model 2 JSP-based applications (an MVC approach) by taking care to design the application from the database forward. By using the database for what it is really good at (caching, storing and reading data), building DAOs to interact with the database, and factoring the business logic out of the JSPs, we were able to build a reliable MVC framework that we used with many clients. The framework was very similar to Struts, which didn't yet exist, and which we started using once Struts2 was released.

Turns out that the amount of traffic you have to experience before you need the overhead of a complex framework like EJB is not realistic for any but a handful of web sites.

Fundamentally, as an architect, it's my job to figure out what the real problem is, to solve it in a way that answers the business need (immediately), and to build it in a way that allows for some level of deviation from today's needs (future scalability and flexibility). So for almost every case that we came across, there was too much complexity and overhead in the EJB framework to make adoption a valid choice back then. Not only did an EJB design require significantly more developer work than the alternatives (making it more costly and harder to change), the initial application wouldn't perform as well (since it was burdened with all the code required to make it scale seamlessly).

All of that said, EJB is also a great study in how a framework can be improved. With the current value proposition of EJB3, the objections that were so clear before have gone away: it no longer takes a rocket scientist to engineer EJBs, and in fact any complexity is fairly well hidden from the developer. Most of the overhead of the framework has been moved into the run-time, so it scales much more appropriately.

As an architect, my decision becomes much easier to include a framework like EJBs when it both saves development time, and gives me a clean and simple path to change. There's always a balancing act between the time to market, and anticipated change in the application too. I worked on some applications that used MS Access or Domino to get off the ground quickly because when used as a framework those applications are great RAD tools. You can prototype and get to the 95% level of usability for an application very quickly, and for many apps this is better than good enough.

The problem with these (as with almost any framework) is when you get to the point you need to do something that the framework wasn't designed for. You built a killer app in Access, and now you want to roll it out to the entire company. Well, even though Access claims to work for multiple users, turns out it is a pain, and uses some very archaic methods for things like sharing the database. And even if you reengineer it to store your data in an enterprise DB, it still has the problem of needing to be deployed somewhere or being shared (which again runs you into the locking and corruption problems).

Every problem requires thought and some knowledge of the limitations of the tools in your tool box. By understanding the business problem, and coming to a reasonable understanding of what sorts of changes you can anticipate (including unanticipated ones), you can choose the right level of complexity to solve the problem for today and tomorrow.
Read more...

Wednesday, December 24, 2008

Content that’s too dynamic …

Recently I've been noticing that ad content is being served up much more dynamically than I'd expect. When I'm looking at the menu on TiVo, or surfing Facebook, there are always little ads displayed that don't immediately catch my attention. In fact most of the time, the ad doesn't even register until I've clicked something and am waiting for the next page to load.


Read more...

Saturday, December 20, 2008

Moving my VolunteerCake to my Mac …

I have been running development for VolunteerCake with the database on my Windows box, which sits in my office with my Mac. Then I went to meet some people at a coffee shop and realized I couldn't show them the app running on my MacBook, because I was no longer on the same subnet as my Windows box. So I decided to move the database to the Mac.

Since I had everything in place to run Cake on my Mac except for MySQL, the first step was to install MySQL. This turns out to be pretty painless: just grab the DMG from the MySQL site, and voila, MySQL is running on my Mac. I check everything out using the MySQL administration tools, and it all looks good (I can access the DB, set up users, etc.).

Next I need to put the data in the database, so I do a quick export from the phpMyAdmin page on the PC. I end up with a file containing the SQL needed to replicate the entire database. I run this SQL into the Mac's MySQL, and now I have an exact copy of the database on my Mac.
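The same round trip can be done from the command line (a sketch; the database and user names here are hypothetical placeholders, not the actual VolunteerCake credentials, and the target database has to exist on the Mac first):

# On the Windows box: dump schema and data to a single SQL file.
mysqldump -u cake_user -p volunteercake > volunteercake.sql

# On the Mac: create the empty database, then replay the dump into it.
mysqladmin -u root -p create volunteercake
mysql -u cake_user -p volunteercake < volunteercake.sql

Either way you end up with the same thing: one portable SQL file that recreates the whole database.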

After that, I go to the MySQL Administrator tool and make sure the user is set up with access to that database, using the same username and password as on the PC. (If I were more of a DBA, I'd probably have done this with command-line MySQL, but I like GUIs, especially for things I don't do every day, and the MySQL tools are pretty cool.)

Then I need to change my database.php to point at the local database so VolunteerCake gets its data from the Mac. This should be as easy as changing the host name from 'monet' to 'localhost', since I've set up the user and access to the database exactly as they were on the PC.

Finally, all that's left is to fire up the same URL that I have established for my app on my Mac (http://test.lctd.org/VolunteerCake) and ... wait, that didn't work. It says it can't find the table acos for the Aco model ...

Weird, the table is there, I can connect just fine, what could this be? A quick trip to the IRC channel, and I get the suggestion of clearing my cache. OK, try that ... But hit the URL again and no change.

OK, now I'm confused, so I try running 'cake bake' ... and now something interesting: I get an error telling me it was unable to connect to /var/mysql/mysql.sock. What does that mean? I thought I was connecting over a TCP socket; why does it want a file? Is this some sort of file permissions issue?

Back to the IRC chat for some guidance, thinking maybe it's a common problem, a permissions issue or something. Of course they tell me to do exactly what I'd tell somebody else to do: verify that you can connect from PHP first. Good idea - so I whip up a quick connection test page, and get the same error. So now I've confirmed that it's a PHP problem, and not a Cake issue. PHP can connect to a remote DB, but not to the one on my local Mac ...

Now it occurs to me that I often have problems that end up being related to the open source software that came bundled with the Mac, so I do some Google searches on PHP connections to MySQL on Mac OS X, along with the connection error messages. Eventually I find what looks to be the issue: for some reason the MySQL configuration puts the socket file at /tmp/mysql.sock, but the PHP that comes with the Mac looks somewhere else (at /var/mysql/mysql.sock, to be specific). So I basically have three choices: edit the php.ini, edit the MySQL config file, or build symlinks to make the file accessible at both locations.
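For the record, the symlink route is a two-liner (a sketch using the paths above; /var/mysql may not exist yet, and both commands need root):

# Make the socket MySQL creates at /tmp/mysql.sock visible where PHP looks.
sudo mkdir -p /var/mysql
sudo ln -s /tmp/mysql.sock /var/mysql/mysql.sock

The downside is that the link has to be recreated if either side's configured path ever changes, which is part of why I went with the php.ini edit instead.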

I decide to change the php.ini file, which turns out to be another exercise in hunting, since Mac OS X likes to hide the files you'd expect to find in the /etc directory. After some more Google searches, I find that the PHP5 install that comes with Leopard puts the php.ini file in /private/etc, so I edit that file, changing the part that looks like the following:
; Default socket name for local MySQL connects. If empty, uses the built-in
; MySQL defaults.
mysql.default_socket =

To be:
; Default socket name for local MySQL connects. If empty, uses the built-in
; MySQL defaults.
mysql.default_socket = /tmp/mysql.sock

This lets PHP find mysql.sock in the location where MySQL is actually creating it. I check my URL again, and voila, everything is working!

So, to make a long story even longer, I relearned that mixing actual open source with vendor-bundled open source is often problematic. At least one person (Mark Story) on the IRC channel suggested that the best way to set up for Cake development on the Mac is to use MacPorts, since then you end up with matching versions of the software, all in a "normal" open source location.
Read more...

Monday, December 8, 2008

CakePHP and RESTful Web Services - debug problems

I ran into an odd problem with the way Cake is coded that tripped me up for a couple of days. Because I hate it when things don't work the way I think they should, I spent way more time debugging this than anybody should.

I got my basic RESTful service working for the VolunteerCake project, and everything was working swimmingly, until I needed to turn on debug to figure something out ...

With the debug level set to less than 2, calling the action I was interested in with an extension of ".xml" worked fine: I got back the XML representation of the data, with a content type of "application/xml". But if you turn debug up to 2 (or 3), Cake dumps out the SQL that was run in an HTML table.

The problem is that this HTML table is spit out after the rest of the view, meaning my RESTful service no longer returns a well-formed document. Additionally (for reasons I've yet to isolate), when this happens the document comes back with a content type of "text/html" instead of the expected "application/xml". Neither of these would be acceptable for a site providing web services, since the services would break as soon as somebody needed to debug.

The workaround for this is to manually reset the debug level when the extension of "xml" is detected. Since the debug data is useful, and it's just the SQL that appears to break the XML, I asked on the IRC channel what the last place I could set the debug might be. The suggestion was to put it either in the afterFilter, or the end of the view itself.

I found that if I put the following code into the beforeFilter method, I could prevent the problem, at the price of losing my debug output:

if ($this->params['url']['ext'] == 'xml') {
    Configure::write('debug', 1);
    $this->RequestHandler->respondAs('xml');
}
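Moving that check into afterFilter looks roughly like this (a sketch in CakePHP 1.2 conventions, meant to live in app_controller.php; the class skeleton around the check is my assumption, not copied from VolunteerCake):

<?php
// Sketch: reset debug only after the view has rendered, so HTML requests
// keep their debug output while XML responses stay well-formed.
class AppController extends Controller {
    var $components = array('RequestHandler');

    function afterFilter() {
        if (isset($this->params['url']['ext']) && $this->params['url']['ext'] == 'xml') {
            Configure::write('debug', 1);
            $this->RequestHandler->respondAs('xml');
        }
    }
}
?>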

That same code placed in the afterFilter method gave me the debug output in a well formed XML document (excluding the SQL table), as did placing it in the view itself. This leads me to believe that when debug > 1 there is some code that happens after the beforeFilter that is not setting the content type to "application/xml" as would be expected from our routing rules.

Being the bulldog that I am, I dug into the Cake source code to see if I could figure this out. I found the spot where the SQL table was being built, which turned out to be in the showLog() method of the dbo_source.php, which is called by the close() method. Since the close() is called after the view is finished, and the showLog() method simply prints the data, that explains why it breaks the XML. It definitely breaks the MVC encapsulation, since the data gets dumped into an HTML table and spit out after the view is complete.

On the IRC channel, it was suggested that I try creating a data source that overrides the showLog() method and sends that table somewhere other than the response output, which might be worth trying.

I posted my question on the CakePHP Google Group and got the useful suggestion to use FirePHP, which writes the log data out-of-band so it can be displayed in Firebug. So my approach will be to write a dbo_mysql_firephp.php class that does just that. This will at least resolve the MVC encapsulation issue and keep my view relatively clean.

I still want to figure out exactly why the content-type isn't getting set properly, but for now I have a workaround that I'll use, and I'll add the FirePHP debugging to solve the well-formed XML issue if I ever do figure out the content-type problem.

Off to set up my FirePHP plugin and build the dbo class now ...
Read more...

Thursday, December 4, 2008

CakePHP and RESTful Web Services …

I'm on a quest to make my application provide RESTful web services. After much digging, I found a post by Chris Hartjes at http://www.littlehart.net/atthekeyboard/2007/03/13/how-easy-are-web-services-in-cakephp-12-really-easy/ that helped a lot.

Turns out that Cake has some really nifty built-in support that can be turned on really easily. For basic XML support, all you need to do is add a couple of lines to your routes.php file so Cake can handle XML. This is pretty well described in the Cookbook at http://book.cakephp.org/view/477/The-Simple-Setup

So for my VolunteerCake project I added the following lines to my routes.php:

/**
* Add in support for web services by enabling generating output based on extension
*/

Router::mapResources(array('events', 'event_jobs', 'groups', 'jobs', 'slots', 'users', 'user_groups', 'user_slots'));
Router::parseExtensions();

The mapResources() does the magic that maps the REST requests to the actions in the controllers, and the parseExtensions() sets up Cake to do some other routing magic when the request has a ".xml" extension.

So now if I call any of my actions and append ".xml", Cake changes the response type and view to return XML. Next we need to add the view for the XML, which goes in the xml directory under the view we are REST enabling (e.g.- for jobs, we have a views/jobs/xml directory where the view CTP files need to be placed).

First I created the xml directory under the views/jobs folder, and then created an index.ctp. This is a very simple file which uses Cake's XML helper to spit out the data:
<Jobs>
<?php echo $xml->serialize($jobs); ?>
</Jobs>
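For a jobs table, the serialized output looks something like this (a hypothetical illustration; the actual element and attribute names come from your schema, not from anything shown here):

<Jobs>
  <job id="1" name="Registration Desk" event_id="3" />
  <job id="2" name="Cleanup Crew" event_id="3" />
</Jobs>

The helper walks the array Cake's find returns and emits one element per row, with the columns as attributes.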

Now to get the XML to display, all I have to do is create the appropriate views for my REST actions.

So for example if I go to the app/jobs action, I would normally see the XHTML representation of the page like:

Jobs XHTML screenshot

Then if I append ".xml" to that same URL, I get the XML back as shown in the following screen shot:

Screen shot of the Jobs XML in browser

Next we need to create the view.ctp to support sending back the data for a specific job by ID. This is practically identical to the index.ctp, except we use the variable $job instead of $jobs (since that's what Cake returns):
<Jobs>
<?php echo $xml->serialize($job); ?>
</Jobs>

This allows us to get the behavior of being able to get the XHTML for a specific job by using a url like /jobs/view/1 as shown:
Screen shot of jobs/view/1

Then by appending ".xml" to that same URL, we get the XML for the job with ID of 1:

Screen shot of jobs/view/1.xml

You may notice that the XML for this job has a lot more data than we saw in the list. The XML from /jobs.xml is only one level deep, while the data from /jobs/view/1.xml has a hierarchy: a job has slots, which in turn have a job, user_slot and user.

That happened because the index action was set up to fetch only the Job data, while the view action had recursion set in order to gather all the related data. Setting the recursive property to 0 (zero) in the index action fetches no children, while in the view action we set it to 2 (two), which tells Cake to fetch all the HABTM data (see http://book.cakephp.org/view/439/recursive for more on this). Alternatively, we could do a specific find and choose which bits of data we populate in the controller, which would also address the one potential downside of this approach: ALL of the data fields and tables currently end up in the XML stream.
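The two actions might look roughly like this (a sketch in CakePHP 1.2 style; the find calls are my assumptions about the shape of the controller, not code copied from VolunteerCake):

<?php
// Sketch: control how deep the model associations go before
// handing data to the view (and thus to the XML serializer).
class JobsController extends AppController {
    function index() {
        $this->Job->recursive = 0;        // flat rows only, no child models
        $this->set('jobs', $this->Job->find('all'));
    }

    function view($id) {
        $this->Job->recursive = 2;        // pull slots, user_slots and users too
        $this->set('job', $this->Job->find('first',
            array('conditions' => array('Job.id' => $id))));
    }
}
?>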

The basic point here is that we now have a working RESTful service (at least as far as fetching data) that doesn't require a great deal of specific view code.

Next: completing the RESTful CRUD ...
Read more...

Shift my economic paradigm

I was sitting in an interesting presentation tonight about managing your career, called "8 Essential Levers for Job (Search) Success" by Chani Pangali, and as part of his talk he mentioned the paradigm shift that is going on in how careers need to be managed. As we moved from small villages to an industrial society, we evolved from a barter economy, where you traded what you do for what you need, to a market economy based on doing work that supported the industry. To me it seems that this resulted in a huge shift where many relationships were replaced by intermediaries.


Read more...

Thursday, November 27, 2008

SourceForge.net application hosting cache issue

I have a new open source project, VolunteerCake, that is using SourceForge.net's recently released web hosting service. This service includes the typical LAMP stack with MySQL, Apache and PHP, so I thought it would be a great place to keep a demo of the site running.

It was working fine, and then one day I noticed that the pages were being over-aggressively cached. For instance, if I clicked the login button on the front page and logged in successfully, I expected to see a "logout" button and my user name, but instead I saw the original page. By hitting shift-refresh, I was able to get the right page to display, but obviously that wasn't a good way to demonstrate the software.


Read more...

Wednesday, November 26, 2008

Fun with HTML and CSS

I spent some time yesterday figuring out CSS problems for Job Connections.

The Job Connections site was built with a print stylesheet (print.css) that wasn't including all of the parts of the page that should be printed: when somebody tried to print a page, they got nothing but the text in the middle of it.


Read more...

Friday, November 21, 2008

Plaxo: the service I love/hate

A couple of days back, I solved a problem I was having with Plaxo. For a few weeks, I was unable to connect to any of the Plaxo web servers from any of my home machines.


Read more...