Sams Java Media APIs: Cross-Platform Imaging, Media, and Visualization (December 2002, ISBN 0672320940)

  


It's All About Control

The various objects provided by the JMF, such as Players, Processors, DataSources, DataSinks, and plug-ins, have complex and configurable behavior. For instance, it is desirable to allow the frame that a Player starts playing from to be set, or the bit rate for a codec to be specified. The JMF provides a uniform model for controlling the behavior of objects through the Control interface.

Many JMF objects expose Control objects through accessor (get) methods. These can be obtained and used to alter the behavior of the associated objects. Indeed, there is an interface known as Controls that many objects implement as a uniform means of providing the Control objects that specify their behavior.

Hence the standard approach in tailoring a Player, Processor, DataSource, or DataSink to match a particular need is as follows (a short code sketch follows the steps):

1. Obtain the Control object appropriate to the behavior to be configured.

2. Use methods on the Control object to configure the behavior.

3. Use the original Player, Processor, DataSource, or DataSink.
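As a hedged illustration of these three steps, the following sketch assumes a Processor named processor that has already been realized; QualityControl is simply one representative Control, and the quality value chosen is arbitrary.

import javax.media.Processor;
import javax.media.control.QualityControl;

public class ControlSteps {
    // Minimal sketch of the three-step Control pattern.
    public static void tuneAndStart(Processor processor) {
        // Step 1: obtain the Control appropriate to the behavior.
        QualityControl qc = (QualityControl)
            processor.getControl("javax.media.control.QualityControl");

        // Step 2: use methods on the Control to configure the behavior.
        if (qc != null) {
            qc.setQuality(0.5f);   // trade quality against processing load
        }

        // Step 3: use the original Processor.
        processor.start();
    }
}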

Because of some unfortunate naming choices for classes and interfaces in the JMF, the following bear similar names: Control, Controls (x2), and Controller, as well as the javax.media.control package. The following list delineates the differences among these confusingly named items:

Control - An interface describing an object that can be used to control the behavior of a JMF object such as a Player or Processor. The Control interface is discussed in this section. It is extended by a number of specialized Control interfaces, such as FramePositioningControl.

Controller - An interface upon which Player and Processor are built and which is intimately associated with the timing model of the JMF. The Controller interface was discussed in an earlier section of this chapter.

Controls (javax.media.Controls and javax.media.protocol.Controls) - An interface implemented by objects that provide a uniform mechanism for obtaining their Control object(s).

javax.media.control - A package within the JMF API that contains 18 interfaces that extend the basic Control interface. Examples include FramePositioningControl, TrackControl, and FormatControl.

The Control interface itself is particularly simple, possessing only a single method, with the real control functionality being specified in the 18 Control interfaces that extend it in the JMF API. Each of those interfaces has a specific functionality, as indicated by its name and methods. Instances of these interfaces are the means of configuring the behavior of the object from which they were obtained. The following list describes each of the 18 interfaces:

BitRateControl - A control for specifying and querying bit rate settings, such as the encoding bit rate for a compressor (codec).

BufferControl - A control for querying and specifying buffer thresholds and sizes.

FormatControl - A control for querying the format support as well as setting the format for the associated object.

FrameGrabbingControl - A control for enabling the grabbing of still video frames from a video stream.

FramePositioningControl - A control to allow the precise positioning within a video stream, as either a frame number or a time (from the start).

FrameProcessingControl - A control to specify the parameters employed in frame processing.

FrameRateControl - A means of querying as well as setting the frame rate.

H261Control - A control for specifying the parameters of the H.261 video codec.

H263Control - A control for specifying the parameters of the H.263 video codec.

KeyFrameControl - A control for specifying or querying the key frame interval: the interval between transmissions of complete frames (key frames) rather than delta frames in codecs (such as MPEG) that use temporal-based compression.

MonitorControl - A control for specifying the degree of monitoring (viewing or listening to) of media as it is captured.

MpegAudioControl - A control for specifying the parameters of MPEG audio encoding.

PacketSizeControl - A control for specifying the packet size parameters.

PortControl - A control to access the input and output ports of a device (such as a capture device).

QualityControl - A control for specifying the parameters of quality (higher quality generally comes at the expense of higher processing demands).

SilenceSuppressionControl - A control for specifying the parameters of silence suppression. Silence suppression is an audio compression scheme whereby silent passages aren't transmitted.

StreamWriterControl - A control by which the maximum size for an output stream (for example, from a DataSink or Multiplexer) can be set as well as queried.

TrackControl - A control to query, manipulate, and control the data of individual media tracks in a Processor.

Each of these interfaces can be found in the javax.media.control package, which necessitates the import of that package if they are to be employed.

As mentioned previously, each interface possesses methods specific to its functionality. Some are simple, such as FrameGrabbingControl with its single method grabFrame() that returns a Buffer object; others, such as MpegAudioControl, have more than a dozen methods plus associated constants. However, most interfaces are small, with only a handful of methods, and quite easy to understand.

Visual Control for the User

Time-based media, as defined earlier in the book, is media intended for presentation to a human being. It is natural, then, to provide the viewer or listener with maximum control over that experience. To that end, many Control objects have an associated visual Component. That Component can be obtained and added to the graphical user interface provided for the user. Actions upon the Component result in method calls on the associated Control object that hence alter the behavior of the associated Player, Processor, DataSource, or DataSink.

The Control interface that is the superclass of the previous 18 possesses a single method, getControlComponent(), which returns an AWT Component: a graphical component that can be added to a graphical user interface and through which the user can directly and intuitively set the control parameters. However, not all Controls have an associated graphical Component. Those that don't return null from the getControlComponent() method call. Thus, code using the method should check to ensure that a non-null reference was returned before attempting to add the Component to an AWT Container (for example, an Applet, Frame, or Panel). Adding a null Component to a Container will result in an exception being thrown.
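A small sketch of that defensive check follows; it assumes a Control object named control and an AWT Panel named controlPanel that already exist elsewhere in the application.

// Only add the visual component if this Control actually provides one.
java.awt.Component comp = control.getControlComponent();
if (comp != null) {
    controlPanel.add(comp);
} else {
    // No built-in GUI for this Control; build a custom one or skip it.
}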

Getting Control Objects

There are two methods by which Control objects can be obtained. These methods, getControl() and getControls(), are defined in the Controls interface. The Controls interface is extended by many important interfaces, including DataSink, Codec, Renderer, and the various pull and push data and buffer streams. Other important classes such as Controller (that is, the superclass of Player and Processor) also provide the two methods.

The getControl() method is used to obtain a single Control object. The method is passed the complete name of the Control class as a String and returns an object implementing that interface. The fully qualified name of the class must be passed, thus listing the package "path" to the class. The method then returns an object of class Control, necessitating a cast to the appropriate type of Control before its methods can be employed. For example:

BitRateControl bitControl = (BitRateControl)
    processor.getControl("javax.media.control.BitRateControl");

The second means of obtaining Control objects is through the getControls() method. The method accepts no arguments and returns (supposedly) all the Control objects, as an array, associated with the object on which getControls() was called. Look at the following example:

Control[] allControls = player.getControls();

As an example of the Control objects associated with a Player, Listing 8.6 shows the Control objects obtained from the Player used in an earlier example. The getControls() method of the Player object (actually inherited from Controller) was called once the Player was realized, and the objects returned were printed.

Listing 8.6 The 11 Control Objects Obtained on a Particular Player Object when getControls() Was Called

11 controls for a Player @ REALIZED:
1: com.ibm.media.codec.video.mpeg.MpegVideo
2: com.sun.media.codec.video.colorspace.YUVToRGB
3: com.sun.media.renderer.video.DDRenderer
4: com.sun.media.renderer.audio.DirectAudioRenderer$MCA
5: com.sun.media.renderer.audio.AudioRenderer$BC
6: com.sun.media.PlaybackEngine$BitRateA
7: com.sun.media.PlaybackEngine$1
8: com.sun.media.controls.FramePositioningAdapter
9: com.sun.media.BasicJMD
10: com.sun.media.PlaybackEngine$PlayerTControl
11: com.sun.media.PlaybackEngine$PlayerTControl

It is worth noting several points in connection with Listing 8.6.

First, the 11-control case corresponds to the Manager being instructed to create Players that support plug-ins for demultiplexing, codecs, and so on. In the case where the created Player wasn't plug-in enabled, only three Controls were obtained. Enabling plug-in based Players was achieved as follows:

Manager.setHint(Manager.PLUGIN_PLAYER, new Boolean(true));

Second, all the Controls returned by the previous call are from the Sun and IBM packages (com.sun.media and com.ibm.media) and hence aren't documented in the API. Further, only the BasicJMD Control had a visual component, that being the PlugIn Viewer. That combination of undocumented classes and lack of visual components makes the getControls() approach to exerting control not particularly helpful, and next to useless, at least in this case.

However, it appears that a Controller's getControls() method doesn't actually return all possible Control objects for a Player. As noted previously, rather than asking for all Control objects, it is possible to ask for them by name. When that was done, it was possible to obtain a number of Control objects, including FrameRateControl and FrameGrabbingControl among others, as follows:

FrameGrabbingControl frameControl = (FrameGrabbingControl)
    player.getControl("javax.media.control.FrameGrabbingControl");

As in the previous case, these Controls were only available when the Manager's PLUGIN_PLAYER hint had been set to true.

Thus, the surest and safest means of obtaining the appropriate Control objects and using them is with the following algorithm (a code sketch follows):

If using a Player
    Set Manager's PLUGIN_PLAYER hint to true prior to creating the Player
For each Control needed
    Get it by name (full class name passed to the getControl() method)
    If the object returned is not null
        Use it
        If direct user control is to be provided
            Call getControlComponent() on the Control object
            If the object returned is not null
                Add that Component to the GUI
            else
                Implement own GUI interface
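The following sketch expresses that algorithm in code. It assumes a MediaLocator and an AWT Frame created elsewhere, uses FrameRateControl purely as an example, and leaves exception handling to the caller.

import java.awt.Component;
import java.awt.Frame;
import javax.media.*;
import javax.media.control.FrameRateControl;

public class ControlFinder {
    // Sketch of the algorithm above; locator and gui are assumed to exist.
    public static void showFrameRateControl(MediaLocator locator, Frame gui)
            throws Exception {
        // If using a Player, set the PLUGIN_PLAYER hint before creating it.
        Manager.setHint(Manager.PLUGIN_PLAYER, new Boolean(true));
        Player player = Manager.createRealizedPlayer(locator);

        // Get the Control by its full class name.
        FrameRateControl rate = (FrameRateControl)
            player.getControl("javax.media.control.FrameRateControl");
        if (rate != null) {
            System.out.println("Frame rate: " + rate.getFrameRate());

            // Offer direct user control if a visual Component is supplied.
            Component c = rate.getControlComponent();
            if (c != null) {
                gui.add(c);
            }
            // else: implement our own GUI for this Control.
        }
    }
}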

Animation Through Interpolators and Alpha Objects

The Interpolator class extends Behavior and is used in conjunction with an Alpha object to provide animation in the scene graph. We use the term animation to refer to basically non-interactive changes to the scene graph.

The name Interpolator reflects the use of these classes in interpolating between sets of values. We all remember having to interpolate tables and such in math class. The idea here is the same. Often, the programmer will specify two values, and the Interpolator will provide a smooth set of values between the pair. Other Interpolators can have a variable number of points, called knots, to interpolate between.

Note that Interpolators are almost always considered with respect to an Alpha value. In rare cases, an Alpha object is used alone; however, the two are intended to be used together.

The role of the Alpha object is to map time to the referencing Interpolator. In other words, the purpose of the Alpha object is to scale (normalize) time into a series of values between 0 and 1 and to map those values to other objects (especially Interpolators) as input. The scaled time values are called the Alpha values.
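As a small sketch of that scaling (assuming the javax.media.j3d.Alpha class has been imported), the following fragment builds an Alpha that ramps from 0 to 1 over two seconds and loops forever; the durations are arbitrary.

// Loop forever (-1), increasing from 0 to 1 over 2000 milliseconds.
Alpha alpha = new Alpha(-1, 2000);
alpha.setStartTime(System.currentTimeMillis());
// value() returns the scaled time at the current moment, in [0.0, 1.0].
float scaled = alpha.value();
System.out.println("Current Alpha value: " + scaled);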

Interpolator/Alpha pairings are generally used for ballistic events; that is, events that are triggered and run to completion without further changes. The typical example of a ballistic event is the firing of a missile. Once the missile is launched, little can be done to change the course of events (modern guided missiles notwithstanding). Regardless, the general idea is to turn on the Interpolator and let it go. We note, however, that in Java 3D version 1.3, Interpolators do allow their action to be started and stopped through the pause() and resume() methods.

Existing Interpolators

Java 3D provides a number of prewritten Interpolators as part of the base API (in javax.media.j3d, where Interpolator extends Behavior), as well as extensions of these as part of the utilities in com.sun.j3d.utils.behaviors.interpolators. Note that as of Java 3D v1.3, there are new methods to pause and restart Interpolators. Interpolators used for rotating or translating objects in the scene graph extend the abstract class TransformInterpolator. The ColorInterpolator and SwitchInterpolator classes still extend Interpolator directly.

Specifying Alpha Objects

To examine Alpha objects in more detail, consider some arbitrary function of time, f(t) = 10*t. Table 12.3 tabulates this very simple relationship, with one column containing discrete values of t, another the corresponding values of f(t), and a third the Alpha value at each time.

  

Table 12.3. Sample Table for f(t) and Alpha Evaluated at Different Values of t

f(t)      t (seconds)   Alpha
0         0             0.0
10        1             0.01
20        2             0.02
...       ...           ...
990       99            0.99
1000      100           1.0

In this particular example, computation of the Alpha object is simple because the time points are equidistant and therefore could easily be specified in a Behavior with a wakeupOnElapsedTime() wakeup condition. Indeed, all the Interpolators can be specified in this way.

To understand the Alpha object, it is important to understand the two mappings (input-output pairings) that are computed for each Alpha object. The first is termed the time-to-Alpha mapping and describes how the Alpha value, restricted to the floating-point range 0 to 1, relates to time (which is, of course, continuous). The second mapping, Alpha-to-value, specifies how the Alpha values relate to changes in the Java 3D object (for example, a TransformGroup, Color, or Switch).

Indeed, the Alpha class can be used to produce almost any time sequence. For example, the setPhaseDelayDuration() method is used to specify a period of time before the interpolation begins. A fixed (or infinite) number of loops can be specified using the setLoopCount() method. (Note that setting the loop count to -1 specifies infinite looping.)
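A minimal sketch of pairing such an Alpha with one of the prewritten Interpolators is shown below; it assumes a TransformGroup named spinTG (with ALLOW_TRANSFORM_WRITE set) and a parent BranchGroup named root, both created elsewhere, and the durations are arbitrary.

// Alpha: wait one second, then ramp 0 to 1 over four seconds, forever.
Alpha spinAlpha = new Alpha(-1, 4000);
spinAlpha.setPhaseDelayDuration(1000);

// Default RotationInterpolator: rotate spinTG about the y-axis from 0 to
// 2*PI as the Alpha value runs from 0 to 1.
RotationInterpolator spinner = new RotationInterpolator(spinAlpha, spinTG);
spinner.setSchedulingBounds(
    new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 100.0));
root.addChild(spinner);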

Specifying Knots

Knots are much like key frames. In a complex sequence of movements, knots specify where the object is to be at different times in the sequence. In the InterpolationPlotter example that follows, knots are used in conjunction with the Interpolator to specify some interesting movements of a cube through space (the sketch after this paragraph shows the basic pattern). The important thing to remember when using knots is that the first and last knots in the sequence are 0.f and 1.f, respectively. Furthermore, a sequence of knots always progresses upward (for example, knot 2 is higher than knot 1, knot 3 is higher than knot 2, and so on).
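The fragment below sketches those knot rules with a PositionPathInterpolator; the Alpha (alpha) and the target TransformGroup (moveTG) are assumed to exist, and the waypoints are made up purely for illustration.

// Knots: first is 0.0f, last is 1.0f, strictly increasing in between.
float[] knots = { 0.0f, 0.3f, 0.7f, 1.0f };
Point3f[] positions = {
    new Point3f(0.0f, 0.0f, 0.0f),
    new Point3f(2.0f, 1.0f, 0.0f),
    new Point3f(4.0f, 0.0f, -2.0f),
    new Point3f(6.0f, 2.0f, 0.0f)
};
// Move moveTG along the path; the identity Transform3D is the axis.
PositionPathInterpolator mover = new PositionPathInterpolator(
    alpha, moveTG, new Transform3D(), knots, positions);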

The InterpolationPlotter Example

Partly because there are so many options that can be specified in an Alpha object, and further because we want to demonstrate the interaction between an Alpha object and its Interpolator, we provide an example program for experimentation. In Listing 12.16, the InterpolationPlotter application allows the user to create Alphas and then instantiate one of several Interpolators to see the results.

Listing 12.16 InterpolationPlotter.java

import java.applet.Applet;
import java.awt.BorderLayout;
import java.awt.event.*;
import java.awt.GraphicsConfiguration;
import com.sun.j3d.utils.applet.MainFrame;
import com.sun.j3d.utils.geometry.Box;
import com.sun.j3d.utils.geometry.ColorCube;
import com.sun.j3d.utils.geometry.Primitive;
import com.sun.j3d.utils.universe.*;
import com.sun.j3d.utils.behaviors.mouse.MouseRotate;
import javax.media.j3d.*;
import javax.vecmath.*;
import javax.swing.*;
//import javax.swing.JScrollPane.ScrollBar;
import java.util.Random;
import javax.media.j3d.Text3D;
import java.awt.*;
import javax.swing.border.*;

public class InterpolationPlotter extends JFrame {
  VirtualUniverse universe;
  Locale locale;
  TransformGroup vpTrans, geoTG;
  View view;
  Bounds bounds;
  InterpPlot2D ip2d;
  JButton newAlpha, newRandomInterp;
  Alpha a;
  Random r = new Random();
  int n_knots;
  Quat4f[] quats;
  float[] knots;
  Point3f[] points;
  JRadioButton inc_alpha_enabledButton, dec_alpha_enabledButton,
               inc_dec_alpha_enabledButton;
  private String fontName = "TestFont";
  BranchGroup scene;
  WholeNumberField phaseDelayDuration, triggerTime,
      increasingAlphaDuration, decreasingAlphaDuration,
      increasingAlphaRampDuration, decreasingAlphaRampDuration,
      n_points, n_loops;

  public BranchGroup createSceneGraph(boolean newPath) {
    // Create the root of the branch graph; this will be returned
    Appearance app = new Appearance();
    BranchGroup objRoot = new BranchGroup();
    objRoot.setCapability(BranchGroup.ALLOW_DETACH);
    geoTG = new TransformGroup();
    geoTG.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    geoTG.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    TransformGroup boxTG = new TransformGroup();
    boxTG.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    boxTG.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    geoTG.addChild(boxTG);
    BoundingSphere bounds =
        new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 1000.0);
    MouseRotate mouseBeh = new MouseRotate(geoTG);
    geoTG.addChild(mouseBeh);
    mouseBeh.setSchedulingBounds(bounds);
    PlotBehavior pb = new PlotBehavior(this);
    pb.setSchedulingBounds(bounds);
    objRoot.addChild(pb);
    Transform3D yAxis = new Transform3D();

    knots[n_points.getValue()-1] = 1.f;
    for (int ii = 1; ii < n_points.getValue()-1; ii++) {
      knots[ii] = (float)ii / (float)(n_points.getValue()-1);
      System.out.println("ii: " + ii + " knots[ii]: " + knots[ii]);
    }
    if (newPath == true) {
      for (int ii = 0; ii < n_points.getValue(); ii++) {
        quats[ii] = genRandomQuat();
        points[ii] = genRandomPoint();
      }
    }
    for (int ii = 0; ii < n_points.getValue(); ii++) {
      String label = " " + ii;
      geoTG.addChild(MakeLabels(label, points[ii]));
    }
    . . .
    PointArray pathLine =
        new PointArray(n_points.getValue(), GeometryArray.COORDINATES);
    pathLine.setCoordinates(0, points);
    Shape3D path = new Shape3D(pathLine, app);
    geoTG.addChild(path);
    for (int ii = 0; ii < n_knots; ii++) {
      String label = " " + ii;
      geoTG.addChild(MakeLabels(label, points[ii]));
    }
    RotPosPathInterpolator interp = new RotPosPathInterpolator(a,
        boxTG, yAxis, knots, quats, points);
    // Alternative, position-only path interpolator:
    // PositionPathInterpolator interp = new PositionPathInterpolator(a,
    //     geoTG, yAxis, knots, points);
    interp.setSchedulingBounds(bounds);
    objRoot.addChild(interp);
    boxTG.addChild(new ColorCube(.8f));
    objRoot.addChild(geoTG);
    Color3f blue = new Color3f(0.f, 0.9f, 0.f);
    Vector3f bluedir = new Vector3f(0.0f, -8.0f, -8.0f);
    // AmbientLight al = new AmbientLight(true, new Color3f(.1f, .9f, .1f));
    AmbientLight al = new AmbientLight();
    DirectionalLight bluelight = new DirectionalLight(blue, bluedir);
    objRoot.addChild(al);
    objRoot.addChild(bluelight);
    return objRoot;
  }

  public BranchGroup createViewGraph() {
    BranchGroup objRoot = new BranchGroup();
    Transform3D t = new Transform3D();
    t.setTranslation(new Vector3f(0.0f, 0.2f, 30.0f));
    ViewPlatform vp = new ViewPlatform();
    TransformGroup vpTrans = new TransformGroup();
    vpTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    vpTrans.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    vpTrans.setTransform(t);
    vpTrans.addChild(vp);
    view.attachViewPlatform(vp);
    NavigationBehavior nav = new NavigationBehavior(vpTrans);
    vpTrans.addChild(nav);
    nav.setSchedulingBounds(bounds);
    objRoot.addChild(vpTrans);
    return objRoot;
  }

  public TransformGroup MakeLabels(String s, Point3f p) {
    Font3D f3d = new Font3D(new Font(fontName, Font.PLAIN, 1),
                            new FontExtrusion());
    int sl = s.length();
    Appearance apText = new Appearance();
    Text3D txt = new Text3D(f3d, s, p);
    Shape3D textShape = new Shape3D();
    textShape.setGeometry(txt);
    textShape.setAppearance(apText);
    // Using billboard behavior on text3d
    TransformGroup bbTransPt = new TransformGroup();
    bbTransPt.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
    bbTransPt.setCapability(TransformGroup.ALLOW_TRANSFORM_READ);
    Billboard bboardPt = new Billboard(bbTransPt);
    geoTG.addChild(bboardPt);
    BoundingSphere bounds =
        new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 100.0);
    bboardPt.setSchedulingBounds(bounds);
    bboardPt.setAlignmentMode(Billboard.ROTATE_ABOUT_POINT);
    Point3f rotationPt = new Point3f(0.0f, 5.0f, 0.0f);
    bboardPt.setRotationPoint(p);
    bbTransPt.addChild(textShape);
    return bbTransPt;
  }

  public Point3f genRandomPoint() {
    float x = r.nextFloat()*15;
    float y = r.nextFloat()*15;
    float z = r.nextFloat()*15;
    Point3f p = new Point3f(x, y, z);
    // System.out.println("Point3f is made of x: " + x + " y: " + y + " z: " + z);
    return p;
  }

  public Quat4f genRandomQuat() {
    float x = r.nextFloat()*200;
    float y = r.nextFloat()*200;
    float z = r.nextFloat()*200;
    float w = r.nextFloat()*1;
    Quat4f q = new Quat4f(x, y, z, w);
    // System.out.println("Quat4f is made of x: " + x + " y: " + y +
    //                    " z: " + z + " w: " + w);
    return q;
  }

  public InterpolationPlotter() {
    super("Interpolation Plotter");
    . . .
    universe = new VirtualUniverse();
    locale = new Locale(universe);
    GraphicsConfigTemplate3D g3d = new GraphicsConfigTemplate3D();
    GraphicsConfiguration gc =
        GraphicsEnvironment.getLocalGraphicsEnvironment().
        getDefaultScreenDevice().getBestConfiguration(g3d);
    Canvas3D c = new Canvas3D(gc);
    . . .
    plotpanel.add(c);
    PhysicalBody body = new PhysicalBody();
    PhysicalEnvironment environment = new PhysicalEnvironment();
    view.addCanvas3D(c);
    view.setPhysicalBody(body);
    view.setPhysicalEnvironment(environment);
    // Create a simple scene and attach it to the virtual universe
    bounds = new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 100.0);
    scene = createSceneGraph(true);
    BranchGroup vgraph = createViewGraph();
    locale.addBranchGraph(vgraph);
    locale.addBranchGraph(scene);
  }

  public void newAlpha() {
    System.out.println("new Alpha");
    a = new Alpha(-1,
                  Alpha.INCREASING_ENABLE | Alpha.DECREASING_ENABLE,
                  //Alpha.DECREASING_ENABLE,
                  triggerTime.getValue(),
                  phaseDelayDuration.getValue(),
                  increasingAlphaDuration.getValue(),
                  increasingAlphaRampDuration.getValue(),
                  alphaAtOneDuration.getValue(),
                  decreasingAlphaDuration.getValue(),
                  decreasingAlphaRampDuration.getValue(),
                  alphaAtZeroDuration.getValue());
  }

  public void resetScene() {
    ip2d.setNewAlpha(a);
    ip2d.refreshPlot();
    scene = createSceneGraph(false);
    locale.addBranchGraph(scene);
  }

  public void newScene() {
    scene.detach();
    quats = new Quat4f[n_points.getValue()];
    knots = new float[n_points.getValue()];
    points = new Point3f[n_points.getValue()];
    scene = createSceneGraph(true);
    locale.addBranchGraph(scene);
  }
  . . .

Figure 12.9 shows the screen output for InterpolationPlotter. The InterpolationPlotter program allows for the adjustment of several parameters of the Alpha object to be manipulated. The program also allows for placement of the knots.

Figure 12.9. Screen output for InterpolationPlotter.

Comprehensive Example: Kspace Visualization

The following is a comprehensive example that will be integrated with some of the Java 3D examples from later chapters to form one of the integrated examples at the end of the book. It isn't necessary to understand the subject matter to understand the graphics programming; however, some background information is helpful.

Background Information on Kspace

  One of the fundamental concepts in magnetic resonance imaging is kspace. It is sometimes said in jest that kspace is a place where no one can hear you scream. Without a doubt, kspace is a tough concept for people learning MRI. However, kspace is a simple extension of the concept of Fourier space that is well known in imaging. For now, suffice it to say that Fourier space and image space are different representations of the same data. It is possible to go back and forth between image space and Fourier space using the forward and inverse Fourier transform. Many image processing routines make direct use of this duality.

What makes MRI different from other imaging modalities is that the data are collected directly in Fourier space by following a time-based trajectory through that space. The position in Fourier space is directly related to the gradient across the object being imaged. By changing the gradient over time, many points in kspace can be sampled in a trajectory through Fourier space. Discrete samples are taken at each point until the Fourier space is filled. A backward (inverse) Fourier transform is then performed to yield the image of the object (in image space). It turns out that there are numerous (almost infinite) ways to traverse kspace. One such trajectory is called spiral and is attractive because of the inherent properties of the circle. The spiral kspace trajectory allows images to be taken in snapshot (20 ms) fashion and is used for such leading-edge applications as imaging of the beating heart or imaging brain activity (functional magnetic resonance imaging). Figure 3.13 shows the final trajectory. Blue dots indicate all time points up to the current time, whereas red dots indicate the full trajectory. This part of the visualization will be integrated with Java 3D models in the KspaceModeller application developed later in this book.

Figure 3.13. Final kspace trajectory for KspacePlot.java.

In this example, we will use a slider bar to advance through time and see the current and previously sampled kspace points. The visualization will be made using an actual file that sits on

  the computer that runs an MRI scanner. We will later expand the number of kspace trajectories we can make, including making programmatic versions that aren't based on real data.

Step 1: The Visualization

First, let's make a skeleton of our external extended Canvas and call it KspaceCanvas. It is in this external class that all our painting will be accomplished. We will begin by painting an x- and y-axis with labels. An important part of this process is determining the height and width of the Canvas (accomplished with the getSize() method).

Listing 3.7 KspaceCanvas.java: A Custom Canvas for Visualization

import java.awt.*;
import java.awt.geom.*;
import java.awt.image.*;

public class KspaceCanvas extends Canvas {
  int xcenter, ycenter, offsetx, offsety;
  int tpoint;

  KspaceCanvas() {
    System.out.println("Creating an instance of KspaceCanvas");
  }

  public void drawKspace(int tpoint) {
    System.out.println("drawKspace");
    this.tpoint = tpoint;
    repaint();
  }

  public void paint(Graphics g) {
    Graphics2D g2d = (Graphics2D)g;  //always!
    offsetx = (int) this.getSize().width/50;
    offsety = (int) this.getSize().height/50;
    xcenter = (int) this.getSize().width/2;
    ycenter = (int) this.getSize().height/2;
    //call method to paint the x-y axis
    paintAxis(g2d);
  }

  public void paintAxis(Graphics2D g2d) {
    //setup font for axis labeling
    Font f1 = new Font("TimesRoman", Font.BOLD, 14);
    g2d.setFont(f1);
    g2d.drawString("Kx", this.getSize().width-(2*offsetx),
                   this.getSize().height/2);
    g2d.drawString("Ky", this.getSize().width/2, offsetx);
    // draw axes for kspace
    g2d.setColor(Color.black);  //set rendering attribute
    g2d.drawLine(offsetx, ycenter,
                 this.getSize().width-offsetx, ycenter);
    g2d.drawLine(xcenter, offsety,
                 xcenter, this.getSize().height-offsety);
  }
}

Notice the drawKspace() method, which receives the int argument tpoint; its sole function is to set the current timepoint and call repaint(). The tpoint variable will represent the current timepoint in the kspace trajectory and will run from 0 to 5499. In the next step, we will create a slider bar with an attached listener. In that listener, the drawKspace() method will be called with the tpoint variable representing the current value of the slider bar.

The other important code for this part occurs in the overridden paint() method and in the paintAxis() method. First, in the paint() method, we get the information necessary for scaling the axes appropriately. We want the x and y centers located halfway up the Canvas regardless of size, so we use the getSize() method that was briefly described previously.
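As an illustration of how those values might be used when the kspace points are eventually plotted, the following hypothetical helpers (which would live inside KspaceCanvas and reuse its fields) map a gradient sample onto pixel coordinates centered on the Canvas; the normalization by the maximum gradient magnitude is an assumption, not code from the book.

// Hypothetical helpers: map a kspace sample (gx, gy) to Canvas pixels,
// scaling by the largest gradient magnitude so the plot spans the axes.
int toPixelX(double gx, double gmax) {
  return xcenter + (int) ((gx / gmax) * (xcenter - offsetx));
}

int toPixelY(double gy, double gmax) {
  // Screen y grows downward, so flip the sign for the vertical axis.
  return ycenter - (int) ((gy / gmax) * (ycenter - offsety));
}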

Step 2: Setting Up the User Interface

This step is fairly easy. We will extend JFrame and add two JPanels, one to hold the user interface components and another to hold our Canvas, as shown in Listing 3.8.

Listing 3.8 KspacePlot.java

import java.applet.Applet;
import java.awt.*;
import java.awt.event.*;
import javax.vecmath.*;
import javax.swing.*;
import javax.swing.event.*;
import java.awt.BorderLayout;
import java.util.Vector;
import java.awt.GraphicsConfiguration;

public class KspacePlot extends JFrame {
  int kwidth, kheight;
  KspaceData ks;
  KspaceCanvas kspacec;

  public KspacePlot(int initx, int inity) {
    super("Kspace Plot");
    Panel sliderpanel = new Panel();
    JSlider hslider = new JSlider();
    hslider.setMinimum(0);
    hslider.setMaximum(5498);
    hslider.setValue(0);
    hslider.setPreferredSize(new Dimension(300, 20));
    sliderpanel.add(hslider);
    BorderLayout f1 = new BorderLayout();
    this.getContentPane().setLayout(f1);
    this.getContentPane().add(sliderpanel, BorderLayout.NORTH);
    ks = new KspaceData();
    kspacec = new KspaceCanvas();
    // attach the listener that drives redraws from the slider
    hslider.addChangeListener(new SliderListener(kspacec));
    this.getContentPane().add(kspacec);
    kspacec.drawKspace(0);
  }

  public static void main(String[] args) {
    int initialSizex = 500;
    int initialSizey = 500;
    WindowListener l = new WindowAdapter() {
      public void windowClosing(WindowEvent e) { System.exit(0); }
      public void windowClosed(WindowEvent e) { System.exit(0); }
    };
    KspacePlot k = new KspacePlot(initialSizex, initialSizey);
    k.addWindowListener(l);
    k.setSize(500, 500);
    k.setVisible(true);
  }
}

class SliderListener implements ChangeListener {
  KspaceCanvas kc;
  int slider_val;

  public SliderListener(KspaceCanvas kc) {
    this.kc = kc;
  }

  public void stateChanged(ChangeEvent e) {
    JSlider s1 = (JSlider)e.getSource();
    slider_val = s1.getValue();
    System.out.println("Current value of the slider: " + slider_val);
    kc.drawKspace(slider_val);
  } //End of stateChanged
} //End of SliderListener

Step 3: Reading the Scanner Trajectory

Because we want to plot an actual trajectory from the scanner, we will read bytes from a file into an array using a FileInputStream object. There are a few examples that read raw data in this book, and you will find that, in practice, reading bytes is occasionally necessary. It probably isn't necessary to get into the details of this class. The important concept is that two files (kxtrap.flt and kytrap.flt) are read into two public arrays. Maximum and minimum values are computed for each array and stored in public variables. Another public array of time values is calculated based on the number of timepoints. The values stored in these arrays will be made accessible to KspaceCanvas by adding code to the KspaceCanvas constructor in a later step. Note that this class is external and stored in the file KspaceData.java (see Listing 3.9).

Listing 3.9 KspaceData.java: An External Class for Reading a Scanner Trajectory

import java.awt.*;
import java.io.*;

class KspaceData {
  int tpoints = 5500;
  double Gxcoord[] = new double[tpoints];
  double Gycoord[] = new double[tpoints];
  int time[] = new int[tpoints];
  double Gx, Gy;
  double Gxmax = 0.0;
  double Gymax = 0.0;
  double Gxmin = 0.0;
  double Gymin = 0.0;

  KspaceData() {
    readdata();
  }

  public void readdata() {
    try {
      DataInputStream disx =
          new DataInputStream(new FileInputStream("kxtrap.flt"));
      DataInputStream disy =
          new DataInputStream(new FileInputStream("kytrap.flt"));
      try {
        for (int i = 0; i < tpoints-1; i++) {
          Gx = disx.readFloat();
          Gy = disy.readFloat();
          //find min and max gradient values to determine scaling
          if (Gx > Gxmax) { Gxmax = Gx; }
          if (Gy > Gymax) { Gymax = Gy; }
          if (Gx < Gxmin) { Gxmin = Gx; }
          if (Gy < Gymin) { Gymin = Gy; }
          Gycoord[i] = Gy;
          Gxcoord[i] = Gx;
        }
      } catch (EOFException e) {
      }
    } catch (FileNotFoundException e) {
      System.out.println("DataIOTest: " + e);
    } catch (IOException e) {
      System.out.println("DataIOTest: " + e);
    }
  }
}
