Tuesday, April 12, 2011

VMWare Screen Repaint issues

I recently added an iMac with 8 GB of RAM, and I use VMWare because I need to run an Oracle database and some of the Oracle utilities. To my horror, the repaint on the XP VM was marginal, and I often found myself doing double takes at the things I had written (more than the usual amount, anyway).

I tried mucking with the 3D Acceleration on the VM Settings, but that didn't have any effect.

What did the trick was modifying the settings on the Windows VM - in this case XP.

Under Control Panel / Display Settings / Settings Tab:

  1. Click the Advanced button.
  2. Go to the Troubleshoot tab.
  3. Adjust the Hardware acceleration down one notch.
  4. Restart your VM.

No more double takes due to missing pixels.

Thursday, November 11, 2010

Trinidad Date Picker Issue with Daylight Savings Time

Trinidad date pickers in the site went haywire just after the Daylight Saving Time change on November 6. Users who selected a date prior to November 7 would find the input field populated with the previous day. The issue is in the JavaScript in the trinidad-impl jar; we are using version 1.2.13. The issue occurred for servers running on both Unix and Windows, and for clients on both Mac and PC.

I tried setting the time zone in trinidad-config.xml with a number of different options: UTC-8, GMT-8, and Pacific Daylight Time, but no luck.

So I opened the jar:

jar -xvf trinidad-impl.jar

In META-INF\adf\jsLibs the file DateField is the culprit. In the function

_getDayLightSavOffset(a0)

the function gets the client date and compares its time zone offset to the server time zone offset. For dates after the time change, the offset is 0. For dates prior to the time change, the offset is always 60. For our app, we always want the value the user selects in the popup to appear in the date field, so I just always return 0 here. I tested with the server date set to various times before and after the time change, and the fix works fine.

Make your change, repackage with jar -cvf trinidad-impl.jar *, and substitute your new trinidad jar.


There are related posts, and the problem was thought to have been fixed previously. From TRINIDAD-1349:

_uixLocaleTZ stores the timezone offset of the server at the time the page was displayed (current time), and currentDateTZOffset is the timezone offset of the client at the current time as well. However, the timezone offsets for both client and server can differ for the date that was picked due to daylight savings rules. For example, the current time is 3 Dec 2008 and the server is in PST (UTC-8) and the client is in Perth (AWDT, UTC+9), so the difference is 17h. But if the user picks Apr 25, the server is actually in PDT then (UTC-7) and the client in AWST (UTC+8), so the difference is actually 15h. The original code would subtract 17h, which would cause the resulting date to move to the previous day.

Thursday, May 20, 2010

Performance Improvements for DAO Testing in an ORM environment

In any good-sized project using an ORM tool such as Hibernate, you are bound to have a good number of DAO classes that encapsulate persistence interactions. Running JUnit tests against your DAOs can take some time with ANT, as a new SessionFactory is instantiated for each unit test. On a current project, the time to run the tests in our DAO package was about three minutes. I found a solution that cut the time to run all the tests down to 40 seconds. The solution: a JUnit TestSuite.

The TestSuite instantiates the SessionFactory once, and all the tests reuse the instance. The code:

import junit.extensions.TestSetup;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class DaoTestSuite extends TestCase{

public static Test suite(){
TestSuite suite = new TestSuite();
suite.addTestSuite(DaoTest1.class);
suite.addTestSuite(DaoTest2.class);

// and so on ... or you could do this reflectively

suite.addTestSuite(DaoTestN.class);

// Wrap the suite so setUp/tearDown run once around ALL the tests
TestSetup wrapper = new TestSetup(suite){
protected DbUtil dbUtil = new DbUtil();

@Override
protected void setUp(){
//Before the suite, we configure the factory
dbUtil.beginConversationThread();
}

@Override
protected void tearDown(){
//After the suite, we roll back so no test data remains
dbUtil.rollbackTransaction();
}
};

return wrapper;
}
}



The DbUtil class is a standard ORM helper that configures the SessionFactory, ensures that a session is open and a transaction is active, and provides a means to roll back the transaction.
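The post does not show DbUtil itself; the following is a minimal sketch of what such a helper might look like, assuming a hibernate.cfg.xml on the classpath and the thread-bound session context (hibernate.current_session_context_class=thread) - adjust to your own configuration:

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class DbUtil {

    // Built once per JVM; every test in the suite reuses it
    private static SessionFactory factory;

    private static synchronized SessionFactory getFactory() {
        if (factory == null) {
            factory = new Configuration().configure().buildSessionFactory();
        }
        return factory;
    }

    /** Ensure a session is open and a transaction is active for this thread. */
    public void beginConversationThread() {
        Session session = getFactory().getCurrentSession();
        if (!session.getTransaction().isActive()) {
            session.beginTransaction();
        }
    }

    /** Roll back the active transaction so tests leave no data behind. */
    public void rollbackTransaction() {
        Session session = getFactory().getCurrentSession();
        if (session.getTransaction().isActive()) {
            session.getTransaction().rollback();
        }
    }
}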

The Test classes that are added to the suite have the following before and after methods:

protected DbUtil _dbUtil = new DbUtil();


public void setUp() {
_dbUtil.beginConversationThread();
}


public void tearDown() {
_dbUtil.rollbackTransaction();
}


Here the beginConversationThread() call grabs an active session from the factory, or starts the factory if it is not yet configured and running. Since the TestSuite runs in one JVM, the SessionFactory is configured only once.

Tuesday, December 15, 2009

Custom JSF Components with Facelets

The Overview
Creating custom components can really clean up your Faces pages and keep you from repeating yourself over and over. And it's fun to do! There are a number of posts and tutorials out there about creating custom components, and often they leave out the most important pieces of configuration. In this post I will show how to create composite components with Facelets. This differs from JSF-style custom components in that these are much easier to create, require much less configuration, and often solve the problem at hand very quickly.

The Affected files
  1. The web descriptor
  2. The component definition
  3. The tag library descriptor
  4. The pages that you insert the component into


The web descriptor (web.xml) tells your container about your tag lib descriptor file.

The component definition file specifies how the component is laid out. It may contain logic, and you may pass parameters to it. It produces the markup that appears in your faces page.

The tag library descriptor is an XML file that describes to the container what to do when it encounters your custom component tag.

Finally, your pages need to include the tag for it to be seen.

Getting Started
The first thing to do when creating a component is to generate the markup, styling, and any scripts that accompany it in a sandbox. Try inserting it in your pages, see what it looks like, and get it working correctly. Once this is done, you are ready to generate the reusable tag that will display your component each time you need it. In this case I made a progress bar for my application.

Directory Layout
Under WEB-INF, create a directory called facelets, with a sub directory called tags.
In the tags directory we are going to place our tag library descriptor and our component.

myProject.taglib.xml
The Tag Lib Descriptor file: (myProject.taglib.xml)

<?xml version="1.0"?>
<!DOCTYPE facelet-taglib PUBLIC
"-//Sun Microsystems, Inc.//DTD Facelet Taglib 1.0//EN"
"facelet-taglib_1_0.dtd">
<facelet-taglib>
<namespace>http://www.yourId/jsf</namespace>
<tag>
<tag-name>progress</tag-name>
<source>progress.xhtml</source>
</tag>
</facelet-taglib>



The file specifies the following items:

namespace - your unique id that won't conflict with other known URLs. It does not need to exist on a server; it just needs to be unique. In this example I called it http://www.yourId/jsf. Any components that you place in this tag lib descriptor will be accessed in your pages using this as the URI to find this definition file. So in the pages that contain the component, the following attribute will be added in your opening definitions:
xmlns:xx="http://www.yourId/jsf"
where xx will be the tag handle in the page.

tag-name is the name of this particular component. In this case it is a progress bar. In your page you will access it via xx:progress

source is the name of the file that we will create to define our tag - the markup.


web.xml
Now we need to tell the application about our library. In web.xml, create the following entry:

<context-param>
<param-name>facelets.LIBRARIES</param-name>
<param-value>/WEB-INF/facelets/tags/myProject.taglib.xml</param-value>
</context-param>

If you already have an entry like this, the param value can be a semicolon-delimited list.


progress.xhtml
So now we have our tag lib descriptor, and the container knows about it. Let's generate the component file. The name of the file must match the name in the taglib descriptor for the component - it is the source tag. So when the application encounters the tag name in the page, it looks it up in the taglib descriptor (which it finds via the namespace definition) and locates the source file associated with the tag name.

This file also resides in WEB-INF/facelets/tags.

The file:



<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:c="http://java.sun.com/jstl/core"
xmlns:fn="http://java.sun.com/jsp/jstl/functions">

<ui:composition>

<span class="progressContainer" >
<div class="progressBar" >

<c:if test="...your condition...">
...more markup
</c:if>

</div>
</span>

</ui:composition>
</html>




I've left out my markup, but you can put yours in. We will look at parameter passing a bit later. For now, let's just get you using a tag in your pages. At this point everything is set to go. You just need to include the definition for your library in your page, and then specify the tag in the page.

So in your page:

In the opening tag, include the definition:

xmlns:xx="http://www.yourId/jsf"

Note that this must match the namespace specified in your tag lib descriptor. Remember that this is an arbitrary name for id purposes only. The xx is an arbitrary handle for accessing the library in your page.

Then, in your page to use the tag you would call:


<xx:progress/>



OK. So at this point the tag should appear in your pages where you specify it. In the next edition, we will discuss parameter passing and using JSTL in the tags. I also like to include certain tags in my templates that pages use. This requires a different approach for parameter passing, as the template defines where the tag will be, but the pages may want to use the tag differently.

Hope that helps.

Tuesday, October 13, 2009

Refreshing Data in the UI

Once your application is up and running smoothly and all the data appears as it should, it may seem that all there is left to do is style it up and move on. But in high-volume sites, or low-volume sites with areas of high transaction, you need to ensure the user is seeing the latest information available. This needs to be done via XMLHttpRequests so that the page the user is working on remains intact.

Let's first discuss how the data is stored and retrieved.

In any web application, the user requests data and you present it. Often the state of the page is determined by the data present, so you want to limit the number of calls to the database for this information by caching it in your backing bean. This requires that you reset the data when appropriate. Once the page is loaded, the data remains flat on the page until the user interacts or you cause an update. Thus we load the data on demand, or lazily, and then cache it until it needs refreshing.

Refreshes from the UI are made by AJAX calls. The calls invoke backing bean actions based on the framework you are using. Apache Trinidad has a nice built-in mechanism for making AJAX calls to the JSF backing beans.

In my backing beans, I add a PollListener method that resets the data I want to refresh in the UI. The method looks like this:

public void tableDataPoll(PollEvent ev) {
resetData();
}


When the bean gets the call from the AJAX event, it resets the underlying data so that when the page requests the data, it is pulled afresh. In the UI, the call is made by adding a Trinidad poll (an AJAX call that runs at a set interval) to the page:

<tr:poll interval="30000" pollListener="#{bean.tableDataPoll}" id="ajax1"/>

This causes an AJAX call to invoke my poll listener in the backing bean every thirty seconds. In order for the table of data to update, you need to add a partial trigger to the component that displays the data. This is done as follows:

<tr:table value="#{bean.backingData}" partialTriggers="::ajax1">

So the poll initiates an AJAX call to the bean, the listener resets the backingData in the bean, and the table refreshes by updating itself with the new value for the backing data.

It is incredibly easy to do.
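The post shows only the listener; for context, here is a minimal sketch of the surrounding backing bean. Only tableDataPoll, resetData, and backingData come from the post - the Row type and loadData() query are hypothetical placeholders:

import java.util.ArrayList;
import java.util.List;

import org.apache.myfaces.trinidad.event.PollEvent;

public class TableBean {

    // Cached table data; pulled lazily and reset by the poll listener
    private List<Row> backingData;

    /** Invoked every thirty seconds by the tr:poll AJAX call. */
    public void tableDataPoll(PollEvent ev) {
        resetData();
    }

    /** Drop the cache so the next getter call pulls afresh. */
    public void resetData() {
        backingData = null;
    }

    /** Lazy getter referenced by the tr:table value binding. */
    public List<Row> getBackingData() {
        if (backingData == null) {
            backingData = loadData();
        }
        return backingData;
    }

    private List<Row> loadData() {
        // hypothetical DAO/database call
        return new ArrayList<Row>();
    }
}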

On another project where I did not use Trinidad, we achieved the same by writing the AJAX calls ourselves. Basically, the pages would call your AJAX JavaScript library with the ids of the objects to update, and your backing bean would change those values. I found that an extremely useful tool was a window within the page that showed the AJAX calls. In the page, I added a div element such as:

<div id="div_ajax_log">
<textarea id="log_ajax" name="status" rows="25" cols="55"></textarea>
</div>

I would have a JavaScript library for AJAX, and in this library I would have a toggle to turn AJAX logging on and off. I would also set the style for the div to none or block based on whether I was in debug/dev mode or not. That way, if there was an issue in production, I could inspect the messages going to and from the server.

In my js file for ajax, I would specify whether we are logging, and the element to log to :
// Whether to log
var _logging = false;
// Where to log
var _log_elem;

I would then specify a logging function:

function logger(text, clear) {
if(!_logging)
return;
if(!_log_elem)
_log_elem=document.getElementById("log_ajax");
if(clear)
_log_elem.value="";
var old = _log_elem.value;
_log_elem.value = text + ((old) ? "\r\n" : "") + old;
}

You will need to add in some null checks and/or try blocks in the above basic version.

Then, in my ajax methods I would check to see if the logger was running and if so spit out information to the text area. For outgoing calls:

logger("AJAX Request: " + ((async) ? "Async" : "Sync") + " " + method + ": URL: " + url + ", Data: " + data);

Where the variables were passed in by the calling page. For incoming responses, I would do the same - here the variable AJAX is the XMLHttpRequest generated in the original call.

logger(AJAX.status);

logger(AJAX.responseText);

The result is a conditional logger to the page with the information carried to and from the page in AJAX calls.

Hope that helps.

Tuesday, July 14, 2009

Testing Web Applications

How many times have you been on a project and seen changes to code cause the application to blow up, even though the unit tests all succeeded? To avoid this situation I use HtmlUnit, an open source project distributed under the Apache License.

HtmlUnit emulates a browser that is visiting your web application. Coupled with JUnit, it allows you to write tests that click through the application just as QA does. It ensures that the functionality you put into your pages exists and works properly.

With HtmlUnit and JUnit I am able to test most functionality in the web tier: page flow, form population and submission, JavaScript, popups, dialogs, and AJAX calls. I even created a suite of convenience methods to work with my Trinidad components.
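A minimal sketch of such a test, assuming a hypothetical login form (the URL, form name, and field names are illustrative, not from the post):

import java.net.URL;

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlForm;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

import junit.framework.TestCase;

public class LoginFlowTest extends TestCase {

    public void testLoginLandsOnHomePage() throws Exception {
        WebClient webClient = new WebClient();

        // Visit the application just as a browser would
        HtmlPage page = (HtmlPage) webClient.getPage(new URL("http://localhost:8080/myapp/login.jsf"));

        // Populate and submit the login form
        HtmlForm form = page.getFormByName("loginForm");
        form.getInputByName("username").setValueAttribute("qa");
        form.getInputByName("password").setValueAttribute("secret");
        page = (HtmlPage) form.getInputByName("submit").click();

        // Verify the click-through landed where QA expects
        assertTrue(page.asText().indexOf("Welcome") >= 0);
    }
}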

The one shortcoming has been the ability to test Flex animations. These were always a black box to my test suites. However, Gorilla Logic has now released FlexMonkey, which is also based on HtmlUnit and allows you to test Flex animations from an easy-to-use test suite.

I use these tests as both integration and unit tests. I ensure the application can be "clicked through"; that all pages appear at the correct time, and that interactions in the page all function correctly. I also test various scenarios that will occur in the application, and ensure that the application responds correctly based on the processing of the data passed in.

I no longer worry about the application blowing up in meetings or demos:

IT'S BEEN TESTED!!

Hibernate Search by Date

On the project I am on now, we use Dates as the basis for monetary calculations. The Dates are simply based on the day, month and year, disregarding hours, minutes and seconds. We store our dates after base-lining them - stripping out the H:M:S. This is achieved with Calendar utilities:

static Date baseLineDate(Date myDate) {
GregorianCalendar cal = new GregorianCalendar();
cal.setTime(myDate);
cal.set(Calendar.HOUR, 0);
cal.set(Calendar.MINUTE, 0);
cal.set(Calendar.SECOND, 0);
cal.set(Calendar.HOUR_OF_DAY, 0);
cal.set(Calendar.MILLISECOND, 0);
return cal.getTime();
}

Determining if a Date is before, after, or equal to another is straightforward thereafter- just use the Calendar before() and after() methods.

If you look in the database, the dates are stored as 01-Jan-2009. It would be tempting to run a Hibernate query that tests equality: you could generate a base-lined date and ask Hibernate to return the objects that have this Date. And it works.

However... databases are part of the seedier side of the software town. Lurking there is data imported from other systems and/or updated by rogue administrators. The best way to handle all possible situations correctly is to use the Hibernate Criteria between() method:


getCurrentSession().createCriteria( MyItem.class )
.add(Expression.between("myDateField", baseLinedDate, endLinedDate))
.list();

Where baseLinedDate has been base-lined as above, and endLinedDate corresponds to a Date set to the last millisecond of the day:

static Date endLineDate(Date d){
if(d==null)return null;
GregorianCalendar cal = new GregorianCalendar();
cal.setTime(d);
cal.set(Calendar.HOUR_OF_DAY, 23);
cal.set(Calendar.MINUTE, 59);
cal.set(Calendar.SECOND, 59);
cal.set(Calendar.MILLISECOND, 999);
return cal.getTime();
}

This solution pulls in all objects that have a certain Date defined by Day, Month and Year only.
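Putting the two helpers together, a DAO lookup for everything on a given day might read as follows (a sketch: MyItem and myDateField come from the snippet above; the method name is mine):

public List findByDay(Date day) {
    // Bound the query by the first and last millisecond of the day
    Date start = baseLineDate(day);
    Date end = endLineDate(day);

    return getCurrentSession().createCriteria(MyItem.class)
        .add(Expression.between("myDateField", start, end))
        .list();
}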

Tuesday, March 24, 2009

Trinidad Draggable Dialogs

Users of Apache Trinidad may have run across the issue of not being able to move the Trinidad dialogs around the page. They always sit right in the center of your screen and of course block you from seeing the information behind them. In release 1.02 the authors proposed that the dialogs be made draggable, but nothing has been done. The following is a simple solution I used to make the dialogs draggable. If I had time, I would commit something to the project to handle this - although I believe the reason it has not yet been implemented is the amount of cross-browser code needed to effect it. The following solution took about 5 hours to come up with and refine.

The trick with making the dialogs draggable is that the dialogs are generated after the page loads, use different markup depending on browser type and version, and the markup contains no ids on the DOM elements of interest. I implemented this for Firefox 2, 3 and IE 7. For other browsers, the dialog is rendered with standard behavior.


So lets get started:

Step 1: Add the mouse drag event script to the Trinidad dialog:

In your xhtml/jsp file that will display behind the dialog, add a trh:script call that is rendered conditionally based on browser. The call will need to wait until the page loads to execute, so I add a timer to delay invocation:

<trh:script text="window.setTimeout('attachDraggabilityToDialog()', 500);"
rendered="#{UiUtils.supportsDialogDrag}"
partialTriggers="::pprDialogCallButton"/>

The text attribute calls the attachDraggabilityToDialog script 500 ms after the page loads (I place the script at the bottom of the page).

The rendered attribute calls a backing bean function that specifies whether the browser version is supported for dragging.

The partialTriggers attribute references the id of the Trinidad element that pops up the dialog - in this case the button.

Step 2. Create the script:

// the frame
var dialog_frameDiv= null;
// the title bar on the frame
var dialog_titleBar=null;
// the object to drag around
var dialog_dragObj = new Object();
var GECKO=0;
var IE=1;
var BR_VER=GECKO;
// Detect the browser (sets BR_VER; see the Browser function below)
new Browser();




/**
* After the page has loaded, and the popup has been displayed,
* Call this function:
* e.g: window.setTimeout(attachDraggabilityToDialog, 500);
* NOTE - WORKS IN FF2, 3 AND IE 7
*
*/
function attachDraggabilityToDialog() {
dialog_dragObj = new Object();
dialog_frameDiv= null;
dialog_titleBar=null;

if (BR_VER==GECKO) {
for(var i=0; i < window.document.body.childNodes.length; i++) {
dialog_frameDiv= window.document.body.childNodes[i];
if(dialog_frameDiv!=null && dialog_frameDiv.nodeName=="DIV" && dialog_frameDiv.style.zIndex==5001) {
dialog_titleBar=dialog_frameDiv.childNodes[0];
break;
}
}
}
else if (BR_VER==IE) {
var _node = document.getElementsByTagName("iframe")[0].parentNode;
if(_node !=null && _node.tagName=="DIV"){
dialog_titleBar= dialog_frameDiv=_node;
}
}

if(dialog_titleBar !=null) {
if (BR_VER==IE) {
dialog_titleBar.attachEvent("onmousedown", callDrag);
dialog_titleBar.attachEvent("onmouseup", endDrag);
}
if (BR_VER==GECKO) {
dialog_titleBar.addEventListener("mousedown", callDrag, true);
dialog_titleBar.addEventListener("mouseup", endDrag, true);
}
}

}



Discussion: Based on browser version, we look through the generated markup in the page once the dialog has appeared and find the element based on attributes or expected location of the node in the tree. While not foolproof, this technique works in the many scenarios in our present application, as Trinidad generates the dialogs in a standard manner per browser and version.

The attachDraggabilityToDialog function injects event handlers onto the elements to be dragged. The event handlers are below:


function callDrag(event) {

var el;
var x, y;
dialog_dragObj.elNode = dialog_frameDiv;

if (BR_VER==IE) {
x = window.event.clientX + document.documentElement.scrollLeft + document.body.scrollLeft;
y = window.event.clientY + document.documentElement.scrollTop+ document.body.scrollTop;
}
if (BR_VER==GECKO) {
x = event.clientX + window.scrollX;
y = event.clientY + window.scrollY;
}


dialog_dragObj.cursorStartX = x;
dialog_dragObj.cursorStartY = y;
dialog_dragObj.elStartLeft = parseInt(dialog_dragObj.elNode.style.left, 10);
dialog_dragObj.elStartTop = parseInt(dialog_dragObj.elNode.style.top, 10);

if (isNaN(dialog_dragObj.elStartLeft)) dialog_dragObj.elStartLeft = 0;
if (isNaN(dialog_dragObj.elStartTop)) dialog_dragObj.elStartTop = 0;

if (BR_VER==IE) {
document.attachEvent("onmousemove", startDrag);
document.attachEvent("onmouseup", endDrag);
window.event.cancelBubble = true;
window.event.returnValue = false;
}
if (BR_VER==GECKO) {
document.addEventListener("mousemove", startDrag, true);
document.addEventListener("mouseup", endDrag, true);
event.preventDefault();
}

}


function startDrag(event) {

var x, y;


if (BR_VER==IE) {
x = window.event.clientX + document.documentElement.scrollLeft + document.body.scrollLeft;
y = window.event.clientY + document.documentElement.scrollTop+ document.body.scrollTop;
}
if (BR_VER==GECKO) {
x = event.clientX + window.scrollX;
y = event.clientY + window.scrollY;
}


// Move drag element by the same amount the cursor has moved.

dialog_dragObj.elNode.style.left =(dialog_dragObj.elStartLeft + x - dialog_dragObj.cursorStartX) + "px";
dialog_dragObj.elNode.style.top =(dialog_dragObj.elStartTop + y - dialog_dragObj.cursorStartY) + "px";

if (BR_VER==IE) {
window.event.cancelBubble = true;
window.event.returnValue = false;
}
if (BR_VER==GECKO) {
event.preventDefault();
}
}


function endDrag(event) {
if (BR_VER==IE) {
document.detachEvent("onmousemove", startDrag);
document.detachEvent("onmouseup", endDrag);
}
if (BR_VER==GECKO) {
document.removeEventListener("mousemove", startDrag, true);
document.removeEventListener("mouseup", endDrag, true);
}
}



Discussion: The above event handlers move the DOM element based on mouse movements after the user mouses down on the element, and release it once the user mouses up, handling each browser version.


Finally, I have a simple browser detect, knowing that my script will not be called unless the browser is Firefox 2, 3 or IE 7.


function Browser() {
var ua, i;
ua = navigator.userAgent;
if ((i=ua.indexOf("MSIE")) >= 0) {
BR_VER=IE;
}
}


In my backing bean, I have the following browser detection code that conditionally renders the call to the script based on its findings:



/**
* Currently only ff2,3 ie 7
* @return true if the browser is one of the above.
*/
public boolean getSupportsDialogDrag() {
String ua = UiBaseUtils.getRequest().getHeader("User-Agent");

if(ua ==null)
return false;

if(ua.contains("MSIE 7"))
return true;
if(ua.toUpperCase().contains("GECKO")){
if(ua.contains("Firefox/3."))
return true;
if(ua.contains("Firefox/2."))
return true;
}
return false;

}



That's it.

I believe the correct way to do this would be to donate time and code to the Trinidad project. I hope to do this, but I needed to come up with a quick solution that worked. I was able to commit this less than 24 hours after getting the requirement, so please take this solution with that caveat in mind. QA has found no issues, and user feedback has been positive.

Hibernate AliasToBean Transformer

Problem: You need to access values from tables that are not Hibernate entities, in an application that uses Hibernate to access the database.

Problem: You want to access a couple columns from any number of tables in your database without bringing back all the associated objects.

Solution: The AliasToBean transformer allows you to retrieve specific information in non-entity beans.

This Hibernate API call will allow you to run SQL against your database and populate a list of POJOs that are not Hibernate entities. This technique is great when you need specific information, or perhaps information from multiple tables.


Your pojo:

public class PlainOldObject {
String s1;
String s2;

public String getS1() {return s1;}
public String getS2() {return s2;}

public void setS1(String s1) {this.s1=s1;}
public void setS2(String s2) {this.s2=s2;}

}

Then in a hibernate dao :

public List lookupPlainOldObjects() {

// the sql with a reference to the s1 and s2 fields in our object
String sql="select tableX.col1 as s1, tableY.colx as s2 from tableX, tableY where blah blah blah";

List list = getCurrentSession().createSQLQuery(sql)
.addScalar("s1")
.addScalar("s2")
.setResultTransformer(Transformers.aliasToBean(PlainOldObject.class ) )
.setCacheMode(CacheMode.GET)
.list();

return list;

}

This avoids the need to cycle over multiple objects to pull in the information you need. It also alleviates the need to cycle over Object[]s to sort through your results.

A couple notes:

1. Use the addScalar method to map the aliases in your SQL to the fields in your POJO.
2. Use CacheMode.GET to keep these objects out of the Hibernate session cache.




Monday, March 2, 2009

Passing Parameters via JSF

JSF allows us to trigger updates via various listeners: actionListeners, disclosureListeners, valueChangeListeners, and phaseListeners, to name a few. Sometimes, however, you may want to pass a parameter in to a generic method - perhaps to grab the value of an enumeration or to check a user's ability to view components based on role.

One little-known feature of JSF is that you may access elements of a java.util.Map. This allows us to pass in a key to a map object in one of our contexts, and the map passes back the value via the get(Object) method.

The notation in your jsf page would be:

value="${yourContextHandle.mapObject['mapKey']}"

where

yourContextHandle is an object in one of the contexts you use to access backing code
mapObject is a java.util.Map that is available on the above object via getMapObject()
mapKey is the parameter passed in to find the object from the map.



In your backing code, create a class that implements java.util.Map. You will need to implement quite a few methods, so I typically create a base stub class that handles all the methods beyond get(Object). In the example below, I call this MyBaseJSFMap. Then I extend it and simply override get(Object); a sketch of the base stub follows.
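The post leaves MyBaseJSFMap out; one possible sketch of such a stub, assuming the unused Map operations may throw or return harmless defaults (in particular, some EL resolvers consult containsKey, so the default of true below is an assumption to verify against your JSF implementation):

import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** Stubs out everything except get(Object), which subclasses override. */
public abstract class MyBaseJSFMap implements Map<String, Object> {

    // The only method the EL resolver actually calls for map['key'] access
    public abstract Object get(Object key);

    // Harmless defaults for the read-only queries
    public int size() { return 0; }
    public boolean isEmpty() { return false; }
    public boolean containsKey(Object key) { return true; } // assumption: see lead-in
    public boolean containsValue(Object value) { return false; }
    public Set<String> keySet() { return new HashSet<String>(); }
    public Collection<Object> values() { return new ArrayList<Object>(); }
    public Set<Map.Entry<String, Object>> entrySet() { return new HashSet<Map.Entry<String, Object>>(); }

    // Mutators are unsupported - this map is a lookup facade
    public Object put(String key, Object value) { throw new UnsupportedOperationException(); }
    public Object remove(Object key) { throw new UnsupportedOperationException(); }
    public void putAll(Map<? extends String, ? extends Object> m) { throw new UnsupportedOperationException(); }
    public void clear() { throw new UnsupportedOperationException(); }
}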

But how do you get the answer you need?

Consider the following Map :


public class MyJSFMap extends MyBaseJSFMap {

public Object get(Object o) {
return doSomething((String)o);
}

public String doSomething(String str) {
// your business logic here - perhaps a lookup to a database or a pull from a map
return str;
}
}




Thursday, February 5, 2009

Understanding Hibernate Delete Orphan issues

Ever wonder how Hibernate knows to follow the annotation "CascadeType.DELETE_ORPHAN"?

The short answer is that Hibernate keeps track of removals from these collections. When you remove an object from the collection, the status of that object is set to Status.DELETED, or Status.GONE if the object is still in the persistence context. When Hibernate saves the object tree that contained these ex-collection objects, it sees that each object has been dereferenced and marked for deletion, and removes it from persistent storage and the persistence context.

Thats how it is supposed to happen.

Ever wonder why you are getting the error :
org.hibernate.HibernateException: A collection with cascade="all-delete-orphan" was no longer referenced by the owning entity ?

Let's say that you have an object with a collection whose cascade type is delete-orphan, and it has some child entities in it. And let's say that you change this collection simply by calling the setter on the parent object:

parent.setChildren(someDifferentSet);

or

parent.setChildren(null);

The children that were in the collection are still in the persistence context. However, they are no longer referenced by the parent object, and Hibernate never said goodbye to them. In the Collections class (org.hibernate.engine.Collections), the method processDereferencedCollection(Collection, Session) throws a HibernateException when it finds these items in the persistence context with a status not set to GONE or DELETED. The ensuing error message - 'A collection with cascade="all-delete-orphan" was no longer referenced by the owning entity: ', followed by the parent entity name - is shown.
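For reference, a collection mapped this way typically looks something like the following sketch, using Hibernate's annotations (the Parent and Child entities are illustrative names, not from the post):

import java.util.HashSet;
import java.util.Set;

import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.OneToMany;

import org.hibernate.annotations.Cascade;

@Entity
public class Parent {

    @Id
    private Long id;

    // Orphaned children are deleted when dereferenced from this set
    @OneToMany(mappedBy = "parent", cascade = CascadeType.ALL)
    @Cascade(org.hibernate.annotations.CascadeType.DELETE_ORPHAN)
    private Set<Child> children = new HashSet<Child>();

    public Set<Child> getChildren() { return children; }

    public void setChildren(Set<Child> children) { this.children = children; }
}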

The proper way to handle this situation where you want to replace an existing collection would be:

parent.getChildren().clear();
parent.getChildren().addAll(someDifferentSet);

Similarly, if you just want to remove a child from the collection:

parent.getChildren().remove(child);

By using the collection manipulations, Hibernate is kept up to date with the status of the objects within and without the collection. It then manages the objects in the persistence session properly.



Sunday, January 11, 2009

Dynamically generated Trinidad Component Trees and Input Values

I recently generated JSF pages dynamically with components from Apache Trinidad. The results are wonderful - this saved a great deal of iterative development time by allowing business analysts to specify the UI for various business scenarios. (For those interested, I explain the basics of doing this later in this post.)

We of course found a few issues:

1. When Using PPR, values do not always match bindings.
2. Components sometimes disappear!

Some of the components used PPR on the front side to update other components. I ran into a few issues while validating these fields - the values did not match the values shown in the UI. I scratched my head and rolled up my sleeves.

The issue is that JSF has 2 notions of values for an input component:

1. The component binding.
2. The component value.

The component value is typically the value of the binding. However, if you perform an XmlHttpRequest (PPR), these values may fall out of sync. In my scenario I generate the components and specify bindings to backing beans. This works fine until a PPR request is made and you update the model: the UI value becomes out of sync with the backing model. There are a few solutions to this issue:

1. Call comp.setValue(the value) on the actual component. This will require accessing the component directly.
2. Call comp.resetValue() - this syncs the value to the binding but also requires that you access the component on the server.
3. Use PartialTriggers when generating the component. This requires that you specify the id of the components that will cause a refresh of the value, and that you keep track of these ids. Then you specify the partial triggers on the component at the time of component generation.

I found that solution 3 works best for me, because I ran into another issue with generating components server side and using AJAX on the front: sometimes, inexplicably, components would disappear. The reason for this is that Trinidad generates ids for components, and when you have components generated on the fly, there seems to be competition among the components for the ids. This appears to be a result of mixing Trinidad-generated and dynamically specified ids. The solution is to specify all ids for your generated components yourself. Coupled with the need to specify partialTriggers and to use ids for JavaScript calls, this is the most comprehensive and cohesive approach.

A little example of a dynamically generated UI :


In your UI page you simply add a container with a binding to a managed bean. In the following line, I specify a panelAccordion (without opening and closing brackets) :
tr:panelAccordion binding="#{myManagedBean.accordionX}"

then in your backing bean you generate the PanelAccordion:

CorePanelAccordion accordionX = new CorePanelAccordion();

Make sure that you provide public access to this component. You then build the component by adding boxes, tables, and inputs as you need. You also set the various attributes on all the components.

For instance, lets add a Detail Item with a box and an input with a button:

FacesContext fc = FacesContext.getCurrentInstance();
CoreShowDetailItem tab1 = new CoreShowDetailItem();
// the following names are the handles you would call in the ui.
String tab1Binding = "myManagerKeyString.tab1Name";
tab1.setValueBinding("binding" , fc.getApplication().createValueBinding("#{" + tab1Binding + "}"));

tab1.setDisclosed(true);



CorePanelBox pb1 = new CorePanelBox();
pb1.setStyleClass("yourStyleClass");



CoreInputText comp = new CoreInputText();
comp.setId(genYourId());
comp.setLabel("Label for Component");
comp.setConverter(new YourCompConverter());

// This is the value binding to the backing pojo
String compBindString = "myManagerKeyString.pojo.compAttributeRelationName";
comp.setValueBinding("value", fc.getApplication().createValueBinding("#{" + compBindString + "}"));


CoreCommandButton button = new CoreCommandButton();
button.setText("GO");
button.setId(genYourId());

// create a methodBinding
String actionBindingString = "myManagerKeyString.go";
MethodBinding mb = fc.getApplication().createMethodBinding("#{" + actionBindingString + "}", null);
button.setAction(mb);


Now lets add the components:


pb1.getChildren().add(comp);
pb1.getChildren().add(button);
tab1.getChildren().add(pb1);
accordionX.getChildren().add(tab1);

Then, when the UI is displayed, it references accordionX, and in the backing bean you generate this UI. Your managed bean needs to provide access to the actions and the POJO. This was a simple case; you can of course enhance your components by specifying any other attributes.

A couple pieces of advice:

1. Specify styleClasses and use css to control layout.
2. ValueBindings can be used to specify all the attributes that you normally specify client side. For instance, if the style of a component is bound to a server side call:

String styleBinding = "myManagerKeyString.someStyleBindingCallName";
comp.setValueBinding("styleClass", fc.getApplication().createValueBinding("#{" + styleBinding + "}"));

You can use this to bind any attribute to a server call.


In a recent application, I used XML to specify the fields and their possible attributes in a UI. I then allowed business analysts to choose which components displayed when, and even some of the attributes such as label and order. Then, when the call is made to display a container, I generate the UI based on their selections. It worked quite well.

Tuesday, December 30, 2008

Hibernate and EnhancerByCGLIB issues

Working with Hibernate can make performing routine database operations a breeze. Every now and then, however, you hit a snag that causes your brain to drop on the floor. I recently ran across one of these situations where objects were not being updated. After checking my syntax and inspecting objects, I realized the issue had to do with the fact that Hibernate will often create proxy objects as classes of type EnhancerByCGLIB when pulling collections that are marked FetchType.LAZY. It also occurs when the SessionFactory is asked to load() an object instead of get().


First Case : Lazy Initialization Proxies:

The proxies may contain all the information in your persistent class, and so they appear to be the real deal. When we query these objects they return all the information in the underlying proxied object. We run into a couple of issues, however, if we use standard implementations of equals() in our persistent class.

Let's say your equals() method in the class Noodle looks like this:

public boolean equals(Object obj) {
if(this == obj)
return true;
if(getClass() != obj.getClass() )
return false;
........
}


At this point we see that equals will never succeed - the class Noodle will never equal the Hibernate proxy class. When you try to update an object from the collection, Hibernate never finds the object it needs to update because the equals implementation always fails. If you need to compare classes you could use one of the following:

this.getClass().isAssignableFrom(obj.getClass())

or

HibernateProxyHelper.getClassWithoutInitializingProxy(obj)

If you use Hibernate annotations in your POJOs, this additional Hibernate-aware checking should not be too invasive or out of place. One other mistake people often make when implementing equals is to access the fields directly off the other object:


public boolean equals(Object obj) {
....
Noodle other = (Noodle)obj;
if(! this.attributeA.equals(other.attributeA) )
return false;
....
}


Here we run into 2 issues:

1. The cast may throw a ClassCastException.
2. The attributeA may have limited visibility.

A Solution that worked for me and covers the situation where the other object is not a proxy:

public boolean equals(Object obj) {

if( this == obj )
return true;

if( obj == null )
return false;

if( getClass().equals( obj.getClass() ) ) {
Noodle other = (Noodle)obj;
if( getId().equals( other.getId() ) )
return true;
}
else if( obj instanceof HibernateProxy ) {
if( HibernateProxyHelper.getClassWithoutInitializingProxy(obj).equals( getClass() ) ) {
Noodle other = (Noodle)obj;
if( getId().equals( other.getId() ) )
return true;
}
}

return false;
}

Second Case: You used Load() instead of get():

There may be times when you want to use load() - when you are simply adding references to an object in a collection, for instance. But when you are pulling an object specifically to use its values, you should switch to get(), as the attributes are then guaranteed to be available.
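A quick illustration of the difference, with illustrative names:

// load() may return an uninitialized CGLIB proxy - fine for wiring up references
Noodle ref = (Noodle) session.load(Noodle.class, noodleId);
soup.getNoodles().add(ref); // no SELECT needs to be issued yet

// get() hits the database and returns the initialized object - use it for values
Noodle noodle = (Noodle) session.get(Noodle.class, noodleId);
String name = noodle.getName(); // attributes are guaranteed to be available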


Monday, December 8, 2008

Notes from a Windows to Mac Conversion Bender

I recently upgraded from my two-year-old PC to a new MacBook Pro. It was a tough decision because I need to be able to run Oracle, and the customer I am working with uses WebEx AIM Pro for desktop sharing (I work remotely 75% of the time). The final decision to go Mac was heavily influenced by the fact that I could use VMWare to run an instance of my Windows PC on the Mac. The fear of Vista (Vistanoia?), the desire to leave the world of waiting for apps to respond, and the chance to try out a new toy were beckoning.

I pulled the trigger, and jumped on the configuration express.

I've been in config-hell before, and something deep down told me this would be no different. All the smiling faces I met saying that changing to a Mac was like a trip to Disney surely had to have an unseemly downside lurking in a hidden log file.

The basic setup was pleasant - Eclipse and Tomcat installed as on a PC, by simply unzipping them into the directory they would run from:

%> sudo gunzip apache-tomcat-5.5.tar.gz
%> sudo tar -xvf apache-tomcat-5.5.tar
%> cd apache-tomcat-5.5/bin
%> ./startup.sh (You are now running Tomcat)

Eclipse was similar.

One thing to get used to on a Mac is that you no longer download JDKs from Sun. The JDK comes preinstalled on your machine. If you need a different version or an update, you go to the Apple site and download the JDK for the Mac. Apple doesn't seem to supply options for minor versions.

I ordered up a copy of VMWare Fusion and planned to install my old PC on the Mac as a resource - the ability to access old email, documents, etc. without having to go back to an old machine. However, I had overlooked one issue - Windows XP had been installed on my PC when I bought it, and when I booted it up in VMWare, Windows insisted on an activation key. The old one on the laptop was no longer valid, and so I had three days of use before the activation grace period expired. In that time I copied files over to the Mac and made the most of the situation. VMWare Fusion worked great - and it was phenomenal to see the old PC running on the Mac.

I would suggest taking snapshots of your OS in vmware, and make sure you keep the original so that you can always return to it. After three days however, I said goodbye to my old pc on the mac.

The next step was to get Oracle installed on the Mac. This was simple enough: I installed Ubuntu on a VM in VMWare Fusion. Again, VMWare worked like a charm, and Ubuntu is very nice and user friendly. I used Ubuntu 8.10 and downloaded an OracleXE database. Everything went smoothly. With VMWare I can actually drag and drop files between desktops, and the database has been running without issue.


On Ubuntu, before you start running all over the net looking for apps or jdks to download, use the Synaptic Package Manager under the System tab on your desktop. Real simple.

The stumbling block for me was that the project I work on uses sqlldr to load the database. This executable processes a flat file and inserts the data into the Oracle database based on a template definition. I wanted to run Eclipse and Tomcat on the Mac, and the Oracle instance on the Ubuntu VM. I read that installing the Oracle client on the Mac was possible, but when I did, I found that it did not come with the sqlldr utility.

My first plan to circumvent the issue was to run Eclipse and Tomcat on the Ubuntu instance. This worked, but Eclipse had issues - perspectives needed to be reconfigured with each restart, and I found Eclipse crashed in the VM frequently. I needed an environment that was stable and able to handle heavy loads on the processor and manage memory well. That was not the case on Ubuntu in the VMWare virtual machine.

The next idea was to install a Samba server on Ubuntu. Samba allows other machines to access your files - in this case, the Mac accessing the virtual machine. The install call was

%> sudo apt-get install samba smbfs

This allowed me to access files across machines - however, I was not able to execute the sqlldr command on the Ubuntu machine from the Mac.

My next plan of attack is to update to the OS/X Server and install the Oracle Client for 10g.

And so it goes...

Monday, December 1, 2008

Tomcat and Eclipse on Ubuntu

The following is a quick review of the steps I took to get Eclipse and Tomcat up and running and playing well together. Replace "user" with your local user name. I probably opened things up a bit wide security-wise. It works.
  • Download the tar.gz files for linux from the respective web sites. 
  • gunzip the files 
  • tar xvf the files. 
  • Move Tomcat to /usr/share/tomcat
  • sudo chown -R user:user tomcat
  • sudo chmod -R a+rwx tomcat
  • sudo chmod +x `sudo find tomcat -type d`
  • Move eclipse to /opt
  • sudo chown -R  user:user eclipse
  • sudo chmod -R a+rwx eclipse
  • sudo chmod +x `sudo find eclipse  -type d`
The net effect of the above is that the applications are deployed to their respective locations and the user "user" may run them. I was able to run each separately, but was having issues with Eclipse trying to run Tomcat. These errors were due to restrictions on writing logs and deploying to Tomcat directories because of permission issues. The above commands allow all users to write to these directories.

Next, you can create an executable that specifies an ECLIPSE_HOME variable:
Create an executable for eclipse in usr/bin called eclipse
  • #!/bin/sh
  • export ECLIPSE_HOME="/opt/eclipse"
  • $ECLIPSE_HOME/eclipse $*
Save the file and run clean: 
/opt/eclipse/eclipse -clean

Add the two applications to the menu via System>Preferences>Main Menu. 

In Eclipse, add Tomcat as a server, and in the Tomcat configuration, under "Server Locations", choose the "Use Tomcat Installation" radio button and save. 



SQLDeveloper on Ubuntu

To Install SQLDeveloper, download the application from http://www.oracle.com/technology/software/products/sql/index.html. Unzip the archive and move it to the directory of choice. I chose /usr/share. Change the file to be executable: 

  • unzip sqldeveloper.zip
  • sudo mv sqldeveloper /usr/share/sqldeveloper
  • sudo chmod +x /usr/share/sqldeveloper/sqldeveloper.sh 
Then invoke the sqldeveloper.sh : 
  • sh sqldeveloper.sh
You will be asked to specify the path to the java home. You are up and running. 

To add SQLDeveloper to your menu, go to System>Preferences>Main Menu. Add SQLDeveloper to the menu of choice. The application is now executable from your menu the next time you log in. 

Install Samba on Ubuntu

I have a VM on my Mac running Ubuntu as the guest OS. In order to access files on the VM guest, I added Samba - a file and print server. To install Samba and edit the configuration: 
  • sudo apt-get install samba smbfs (Installs Samba)
  • sudo gedit /etc/samba/smb.conf (Edits the config file)
Then, go to the "Share Definitions" section of the config file that is open in your editor: 

find the lines:
  • ; [homes]
  • ; comment = Home Directories
  • ; browseable = yes

Remove the ";" to uncomment these lines. This allows you to access the home directories on the guest. Then find the following line, which specifies the read-only attribute. It is set to yes by default. To be able to write to your directories, change it to "no" and uncomment it. 
 
  • read only=no  (if you want to be able to write to the directory)

Next you need to find the line "; security = user". Replace it with the following 2 lines: 
  • security = user
  • username map = /etc/samba/smbusers
This specifies that users may authenticate, and that the users are specified in the /etc/samba/smbusers file.  Next we need to add entries to the users file. 
  • sudo gedit /etc/samba/smbusers
The file will be empty. If you want to access your local profile, and let's say the login name is Gus, then add the following line to smbusers: 

  • Gus="Gus" 
Save the file and from the command line create a Samba password: 
  • sudo smbpasswd -a Gus
You will be prompted to enter and confirm the password for this user. That's it, you are done with the install of Samba. 


To stop | start | restart Samba: 
  • /etc/init.d/samba stop | start | restart
The config file allows you to change the behaviour of the Samba server. There are comments in the file that point you to resources for evolving the file and perhaps opening up your system to domains. 

To Access the samba server, go to : 
  • smb://computer_name/Gus
On a Mac, this is under Finder>Go>Connect to Server. You will be prompted for authentication with the password you specified to smbpasswd. If authentication works, the disk will mount on your desktop, and you will be able to browse within your profile on the remote machine. 


Sunday, November 30, 2008

Install Java on Ubuntu

I thought this would be straightforward....

I tried to install using the apt-get command, but my install died when the license agreement took over my terminal and did not give me a way to accept the agreement. I exited and ran dpkg --configure -a to clean up the failed install. 

Instead I went to the System>Synaptic Package Manager and performed the install this way: 
  • Specify "java" in the search box. A List of packages to install appears.
  • I chose the java5 jre, jdk and plugin, and accepted the installation of packages that they were dependent on. 
  • Click Apply
  • Accept the license agreements as they appear.
  • The java home directory will be in /usr/lib/jvm
Note: If the java versions do not appear in the above list, ensure that the /etc/apt/sources.list file contains the following entries : 

  • deb http://us.archive.ubuntu.com/ubuntu/ intrepid multiverse
  • deb-src http://us.archive.ubuntu.com/ubuntu/ intrepid multiverse
  • deb http://us.archive.ubuntu.com/ubuntu/ intrepid-updates multiverse
  • deb-src http://us.archive.ubuntu.com/ubuntu/ intrepid-updates multiverse

You then need to ensure that this Java is the default java for the system: 
  • update-java-alternatives -s java-5-sun (specifies java 5 as the default)
  • update-java-alternatives -l (lists out the default)

Finally, add environment variables. I add them in /etc/environment. You could also add them in your local .bashrc file in your home directory. The values specified in /etc/environment are available in all shells. Be careful of the syntax in this file! If an error occurs you may not be able to boot up...: 

  • JAVA_HOME="/usr/lib/jvm/java-1.5.0-sun"
  • CLASSPATH="/usr/lib/jvm/java-1.5.0-sun/bin:/usr/lib/jvm/java-1.5.0-sun/lib"




Install Oracle XE on Ubuntu

Oracle XE can be downloaded at the Oracle site. I chose the Debian install for Linux. I read the installation guides. This is how to do it:

If you do not have 1GB free memory, or want to reduce the database footprint:

  • $sudo dd if=/dev/zero of=/swpfs1 bs=1M count=1000
  • $sudo mkswap /swpfs1
  • $sudo swapon /swpfs1
I used the Ubuntu apt-get utility to install the database.
  • sudo gedit /etc/apt/sources.list
Insert "deb http://oss.oracle.com/debian unstable main non-free" into the file if it does not exist and save. Then import the Key:
  • $wget http://oss.oracle.com/el4/RPM-GPG-KEY-oracle -O- |sudo apt-key add -
Now install the db:
  • $sudo apt-get update
  • $sudo apt-get install oracle-xe
After the installation you need to run the following:
  • sudo /etc/init.d/oracle-xe configure
Follow the instructions and write down the ports and passwords that you specify. Then you will need to update the users on the system. Go to System>Administration>Users and Groups. Unlock the editor and under the manage groups tab, add your user to the dba group.

The database should be running. Go to http://localhost:8080 - assuming you didn't change this port in post config, and the database info page should appear. The database is now integrated in the menu as well - so if not running, start it it up by navigating to the Applications>Oracle Database>Start Database.

One more step is to specify the ORACLE_HOME and ORACLE_SID environmental variables:
  • gedit $HOME/.bashrc
Add the following lines:
  • ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
  • ORACLE_SID=XE
  • export ORACLE_HOME ORACLE_SID
You could also/instead add these to your /etc/environment file. In that file you would add:
  • ORACLE_HOME="/usr/lib/oracle/xe/app/oracle/product/10.2.0/server"
  • ORACLE_SID="XE"
Either login in again, or type
  • source .bashrc
  • source /etc/environment
To run Oracle utilities add /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin to the path in /etc/environment and source it.


You should be good to go. You can access the database home page via the menu, or go to http://localhost:8080/apex.





Install VMWare Fusion on Mac OS/X 10.5

I needed to run an oracle database on a macbook. At the time, Oracle was not supported on Mac OS/X 10.5 . Talking to colleagues, and doing some research led me to VMWare fusion. It's not free. It works great.

Installing VMWare Fusion is a relatively simple process. I downloaded version 2.0.1-12 from the net and purchased a serial number from VMWare. The VMware site has excellent documentation and even supplies videos to walk you through the process. Here's what I did: 
  1. Download the .dmg file from vmware. (http://www.vmware.com/vmwarestore/buyfusion.html).
  2.  Double click on the .dmg file to mount it. The file will be mounted and displayed in a Finder window.
  3. Double click the Fusion Icon. The installation assistant will walk you through the process. 
  4.  To Start Fusion, navigate to Applications in Finder, and choose the VMWare Fusion Icon. 

The Fusion window will open, and you can create a new Virtual Machine at this point. Next I installed 2 different virtual machines - a copy of my old pc and a fresh Ubuntu install to house the Oracle Database. 

I followed the conversion guide for bringing my old PC over. The guide that comes with Fusion (http://www.vmware.com/pdf/fusion_getting_started_11.pdf) has a link to a flash video that walks you through the process. It took about 8 hours for the VMWare converter to bundle my old PC onto an external disk drive and for me to boot the PC on the Mac. It was pretty cool. The problem was that my copy of XP had only one license (it came bundled on the PC when I bought it), and Windows required a new activation key. So I only had access to my old PC on the Mac for 3 days. I would not do this unless you know you have multiple licenses. 

To install Ubuntu, I went to the Ubuntu site (http://www.Ubuntu.com/getubuntu/download) and downloaded the 8.10 version in 32-bit. I chose 32-bit because the 64-bit version has some issues with downloading and installing packages. Once downloaded, simply open VMWare Fusion:


  1. Choose new 
  2. Click the continue without disk button. 
  3. Choose the "Use Operating System Disk image file" option.
  4. Choose the Ubuntu file you downloaded above from the finder that opens. 
  5. Click Continue and Finish. The new virtual machine will appear in the Fusion window. 
  6. Click the start button next to the Ubuntu vm, and fusion will open a window with the Ubuntu installation running within it. 



I ran through the installation by specifying users and some configuration. I simply accepted the defaults. 

Sage Advice
Install VMWare Tools immediately after the Ubuntu installation completes. Choose Install VMWare Tools from the Virtual Machine menu item on the Mac. To do this, move the untarred VMWare Tools distribution to the /tmp directory. Then run: 
  • sudo ./vmware-install.pl
Choose the defaults, and when complete, add the toolbox to the startup of your Ubuntu session: 

  1. Preferences>Sessions>StartUp Programs>+Add>
  2. Specify vmware-toolbox as the command to execute. 

Fusion allows you to take snapshots of your VM. DO THIS!!!! Comment the snapshot state well, so that when you screw something up you can return to a stable state. As I was installing the various applications on the Ubuntu system, I would stop and take a snapshot after each milestone. This will save you considerable time. 




Thursday, November 27, 2008

eclipse on mac

I run Eclipse against an Oracle instance with HtmlUnit tests banging away on Tomcat. Running the complete suite of unit tests on my dev box can eat processor and memory big time. It's like watching a kid coming home from college for the first time in months and invading the fridge. On a PC there was no processing time left for me. When I set up my environment on the Mac and ran with the base settings, it died by running out of heap space.

So I try to allocate enough memory and set the parameters on the various running JVMs (ANT, Tomcat and Eclipse) in the hope that they won't die after running for 50 minutes with a heap space issue. My box has 4 GB of memory, and I have Oracle running in a separate VMWare Fusion VM on Ubuntu. I am using Eclipse Ganymede with Java 5 (1.5.0_13), Tomcat 5.5, and ANT 4.7.

On the Mac, the eclipse.ini file is under your Eclipse installation at /Eclipse.app/Contents/MacOS/eclipse.ini. You will need a terminal window to get there, as the directory doesn't show up in Finder. You can find it with:

sudo find / -name "eclipse.ini"

When I opened this for the first time I found that the parameters were triplicated - so I cleaned this up. I jack up the max memory, specify to use parallel GC, and keep the eden space low. I specify the following arguments:

-Xmx1024m (the max heap size)
-Xms128m (Startup heap size)
-Xmn64m (the eden space or young generation garbage collectible space)
-XX:+UseParallelGC (Run Parallel GC)


In Java there are two GC threads running. One is a very lightweight thread which does incremental collections, primarily on the Eden (a.k.a. young) generation of the heap. The other is the full GC thread, which traverses the entire heap when there is not enough memory left to allocate space for objects that get promoted from the Eden to the older generation(s). If there is a memory leak or inadequate heap allocated, eventually the older generation will start to run out of room, causing the full GC thread to run continuously. This GC thread eats the processor, and eventually Eclipse has no pie - it won't be able to respond to requests, and they'll start to back up.


The amount allocated for the Eden generation is the value specified with -Xmn. The amount allocated for the older generation is the value of -Xmx minus the -Xmn. Generally, you don't want the Eden to be too big or it will take too long for the GC to look through it for space that can be reclaimed. If you watch the memory monitor closely, you can see the garbage collector releasing memory periodically. Keeping the amount allocated to the eden space low affects the periodicity of this cleanup. Developers can see this on their machines when running the complete stack of tests.
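If you want to watch the generations from code rather than eyeballing a memory monitor, a small utility along these lines (my addition, not from the original post) prints the usage of each heap pool:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

public class HeapWatcher {

    public static void main(String[] args) throws InterruptedException {
        while (true) {
            // One pool per generation: eden, survivor, old/tenured, perm
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                MemoryUsage usage = pool.getUsage();
                System.out.printf("%-20s %,12d / %,12d bytes%n",
                        pool.getName(), usage.getUsed(), usage.getMax());
            }
            System.out.println("----");
            Thread.sleep(5000); // sample every five seconds
        }
    }
}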

These parameters seemed to work for me- my memory consumption remained nearly constant across the task. Before, the memory usage would creep up and reach a terminal point - on my old pc at about 2.4GB. At this point the tests just stopped running. Viewing the memory usage after applying the new args, you could see where GC ran periodically and brought the usage back to a baseline. The tests ran faster. But there was still an occasional lockup.

Another consideration is whether to run ANT, Tomcat and Eclipse in the same JVM. I experimented with different combinations of virtual machines for these processes, and with the memory allocations for each. What I found was that if you run them all in the same JVM, the memory allocation on your machine remains high even after the tasks complete. When you run them in separate JVMs, the memory is returned to the system as each process completes. I found that the machine performed best when running separate JVMs.


Caveat Emptor: Please be aware that not all jvms support the run parallel GC option.

To specify the jvm parameters for ANT, go to the ANT view, right click on the build file or a task, click 'Run As' and then 'external tools configuration'. In the dialog that appears, choose the JRE tab and specify the vmargs and the jvm to use for the file/task.

To Specify the jvm parameters for Tomcat, double click on the tomcat instance in the servers view. Update the vmargs by clicking the Launch Configuration link.