Release Name: 1.5
Notes:
Summary
XNAT 1.5 included a variety of improvements with a strong focus on the
ability to import data into an XNAT server. It addressed several common
complaints with XNAT:
- Ability to archive and access Multiframe DICOM data
  - Multiframe data is now supported by XNAT. We are still fine tuning
    how some of the specific headers should be persisted, but this will
    be an ongoing process.
- Improve performance of the Uploader Applet
  - The uploader applet has been refactored to send DICOM data over HTTP
    (rather than the DICOM protocol). It uploads data scan by scan as
    zips to improve reliability.
- Rewrite the Prearchive management UI
  - The Prearchive UI has been completely rewritten to support sorting,
    filtering and bulk actions.
- Add a RESTful API for managing the prearchive contents
  - The prearchive is now accessible and manageable via the REST API.
- Allow merging of new content into existing sessions
  - Users should be able to upload additional data to existing sessions
    without eliminating pre-existing data. This includes adding new
    scans to existing sessions, and new files to existing scans.
- Incorporate DICOM server functionality into XNAT itself
  - The XNAT server should support the receipt and storage of DICOM data
    sent via DICOM C-STORE.
- Improve installation customizability
  - Admins can now add installation-specific libraries and servlets
    without modifying the XNAT source content.
- Add a user cache space for temporary storage of user files (via REST)
  - Users can now upload data into a user-specific cache space via REST,
    for usage later on.
- Introduced services to the XNAT RESTful API.
Changes:
Detailed Breakdown
- Multiframe Support
- Support multiple frames in a single file
- Separation of session building from scan catalog construction
- Complete scan catalog builder - include attrdef input / attrval output
- Rewrite scan builder to use catalog builder
- Functional/integration testing
- New attribute extraction type: in-sequence element
- New attribute extraction type: chained conditional
- Determine attributes for multiframe MR
- Functional integration testing
- Refactor java attribute specification
- Port Imagescandata to use chained conditionals
- Port Imagesessiondata
- Port MRScandata and MRSessiondata
- Port CTScandata and CTSessiondata
- Port PETScandata and PETSessiondata
- Port all other types
- Use existing xml/catalog structure
- Upload Applet (HTTP/UI)
- Improve UI to support additional parameters and subject
creation with demographics
- Clean up UI
- Modify file transfer logic to go over HTTP (Kevin)
- Add threaded upload, scan-by-scan
- Protocol Validation Engine (This is being released on an
experimental basis)
- Create validation assessment for storing automated protocol
validation reports
- Implement validation mechanism (using Schematron)
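For illustration only (this is not XNAT's validation code, which runs
server-side), the Python sketch below shows how a Schematron rule set is
applied to a session XML document using lxml; the rule, element names
and attributes are invented for the example.

    # Illustrative sketch of Schematron validation; not XNAT's implementation.
    from io import BytesIO
    from lxml import etree, isoschematron

    # Hypothetical rule: every scan element must carry a "type" attribute.
    RULES = b"""
    <schema xmlns="http://purl.oclc.org/dsdl/schematron">
      <pattern id="scan-type">
        <rule context="scan">
          <assert test="@type">Each scan must declare a type.</assert>
        </rule>
      </pattern>
    </schema>"""

    SESSION = b"""
    <session label="demo_MR1">
      <scan ID="1" type="T1w"/>
      <scan ID="2"/>
    </session>"""

    schematron = isoschematron.Schematron(etree.parse(BytesIO(RULES)),
                                          store_report=True)
    valid = schematron.validate(etree.parse(BytesIO(SESSION)))
    print("valid:", valid)  # False: the second scan has no type attribute
    if not valid:
        # SVRL report listing the failed assertions
        print(etree.tostring(schematron.validation_report,
                             pretty_print=True).decode())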
- Prearchive UI
- Support viewing, sorting and filtering on large numbers of rows
(50,000)
- Support bulk actions
- call bulk action process
- support updating rows as actions are performed
- Add Reset feature to rebuild session xml
- Support thousands of entries
- Prearchive API
- Database/Cache Representation
- Unit testing
- Cache should be initialized before first request
- GET
(/prearchive/projects/{PROJECT_ID}/timestamp/SESSION?format=zip,html,xml)
- Support downloading the Files of the prearchive directory as a
ZIP
- Support downloading session xml
- Support downloading session summary html
- DELETE (/prearchive/projects/{PROJECT_ID}/timestamp/SESSION)
(Support deleting a session from the prearchive directory)
- Should MoveToCache
- Delete row from cache table : Could just call the refresh
operation on that uri
- Support URL based notation for referencing a session
- Prearchive wide Refresh (total purge/repopulation of cache
table)
- Session level Refresh (cache or xml)
- Set cache status (manually) via REST or java api
- Trigger re-determination of caches status (rebuild session
object) via REST or java api
- Rebuild session xml via REST or java api
- Support moving sessions from one prearchive project to
another
- Bulk Operations (see the sketch after this list)
- Delete: POST /services/prearchive/delete?target=/prearchive/projects/X/timestamp/SESSION1,/prearchive/projects/X/timestamp/SESSION2
- Archive: POST /services/archive?src=/prearchive/projects/X/timestamp/SESSION1,/prearchive/projects/X/timestamp/SESSION2
- Move: POST /services/prearchive/move?dest=PROJECT2&src=/prearchive/projects/X/timestamp/SESSION1,/prearchive/projects/X/timestamp/SESSION2
- Update DICOM Server to update the cache
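The bulk operations above can be driven from any HTTP client. A minimal
sketch using Python's requests library follows; only the service URIs
and parameter names are taken from these notes, while the host,
credentials, project and timestamps are placeholders. The three calls
are independent examples, not a workflow.

    # Sketch only: host, credentials, project and timestamps are placeholders.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")

    sessions = ",".join([
        "/prearchive/projects/DEMO/20110101_120000/SESSION1",
        "/prearchive/projects/DEMO/20110101_130000/SESSION2",
    ])

    # Move the two sessions into another project's prearchive.
    requests.post(XNAT + "/data/services/prearchive/move",
                  params={"dest": "DEMO2", "src": sessions}, auth=AUTH)

    # Archive the two sessions.
    requests.post(XNAT + "/data/services/archive",
                  params={"src": sessions}, auth=AUTH)

    # Or delete them from the prearchive.
    requests.post(XNAT + "/data/services/prearchive/delete",
                  params={"target": sessions}, auth=AUTH)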
- Merge/Upload API
- Transactional Import (uploads files and updates session as one
transaction)
- Similar to existing Upload Images
- Add support for adding scans to existing sessions
- Requires merging with existing session xml
- Add support for adding files to existing scans
- Parse existing scan catalog
- Parse new scan catalog
- Compare two for duplicates (overwrite=true/false)
- If safe,
- Copy files in, update catalog
- Destination: Prearchive
- Update cache to receiving
- Put files into new or existing session directory
- Run Restructurer
- Merge resulting directories into destination, using catalog
comparison
- Note: duplicates (same instanceUID, different classUID) will be
ignored (see the sketch after this list)
- Run SessionBuilder
- Trigger cache refresh
- Destination: Archive
- Put files into temporary location (prearchive)
- Build XML
- Verify compatibility with existing session xml (if present)
- Requires API for checking compatibility
- Skip compatibility checking for now. Just add the new scans to the
existing session. The probability of problems occurring is low, and
should be manageable with manual intervention.
- If compatible,
- Update/store existing xml
- move files to archive
- make merge transactional
- If not, fail but leave in prearchive for admin followup.
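As a rough illustration of the catalog comparison described above (not
the actual XNAT merge code), the sketch below keys catalog entries by
SOP instance UID, ignores duplicates whose class UID differs, and only
replaces true duplicates when overwrite is requested. The data
structures and UID values are invented for the example.

    # Illustrative duplicate check between an existing and an incoming
    # scan catalog, each modeled as {instance_uid: class_uid}.
    def merge_catalogs(existing, incoming, overwrite=False):
        """Return the incoming entries that are safe to copy in."""
        accepted = {}
        for instance_uid, class_uid in incoming.items():
            if instance_uid not in existing:
                accepted[instance_uid] = class_uid  # genuinely new file
            elif existing[instance_uid] != class_uid:
                continue        # same instanceUID, different classUID: ignore
            elif overwrite:
                accepted[instance_uid] = class_uid  # true duplicate, overwrite
        return accepted

    # Placeholder UID values.
    existing = {"1.2.3.1": "MR-class", "1.2.3.2": "MR-class"}
    incoming = {"1.2.3.2": "MR-class", "1.2.3.3": "MR-class",
                "1.2.3.1": "CT-class"}
    print(merge_catalogs(existing, incoming))   # {'1.2.3.3': 'MR-class'}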
- Gradual Import (uploads files/scans one at a time without
rebuilding session on each request) --COMPLETE
- Archive (involves gradually adding files and then running DCM
Refresh)
- Prearchive (42)
- Add files one at a time (or scan by scan) to an existing
prearchive directory
- Update cache to show the time of last file add and status of
receiving
- Support adding via specific session url
- Support adding via DICOM attributes
- Requires an API for pulling the PROJECT and the STUDY INSTANCE
UID
- Upload would put the file in a temporary directory (in
prearchive)
- Review the file to see where it should go
- Then do a rename to move the file to the proper location
- Response should include OK (depending on content-type of the
push)
- Add timer that rebuilds session xmls for sessions that are
receiving but inactive for X minutes.
- REST based Archive command
- /services/archive?src=/prearchive/projects/X/timestamp/session
- Should support additional parameters (subject, session, etc.) - see
the sketch after this list
- Set status to archiving
- Store XML
- If rename succeeds, then run AutoRun
- Else, run the full Transfer
- Update cache status as deleted
- just call the refresh operation on that prearchive session
level (which will discover that it is gone)
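A sketch of the REST-based archive command with Python requests. Only
the src parameter is documented above; the host, credentials, and the
names of the additional parameters (subject/session labels) are
assumptions.

    # Sketch only: host, credentials and the extra parameter names are guesses.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")

    resp = requests.post(
        XNAT + "/data/services/archive",
        params={
            "src": "/prearchive/projects/DEMO/20110101_120000/SESSION1",
            # Additional parameters are mentioned above but not named in
            # these notes; "subject" and "session" are hypothetical.
            "subject": "SUBJ01",
            "session": "SUBJ01_MR1",
        },
        auth=AUTH,
    )
    print(resp.status_code, resp.text)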
- URI as Identifier
- URI parser /data/project/x/subjects/y/experiments/z = SESSION
(can we somehow integrate this with restlet's parser)
- Process Monitoring
- Should be able to register listeners for processes that take a
while
- UI Modifications (existing pages should use new REST services)
- Upload Images page
- Archive page
- Tests
- DICOM Server Elimination
- Property configuration
- Project Mapping customization support
- Build simple DICOM receiver that would duplicate the Gradual
Prearchive Uploader
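For reference, sending data to the built-in receiver is an ordinary
DICOM C-STORE. The sketch below uses the pynetdicom library; the host,
port, AE title and file name are placeholders that depend on how the
local receiver is configured.

    # Sketch only: host, port, AE title and file name are placeholders.
    from pydicom import dcmread
    from pynetdicom import AE
    from pynetdicom.sop_class import MRImageStorage

    ds = dcmread("image001.dcm")            # any MR image file to send

    ae = AE()
    ae.add_requested_context(MRImageStorage)

    assoc = ae.associate("xnat.example.org", 8104, ae_title="XNAT")
    if assoc.is_established:
        # XNAT routes the received object to the prearchive.
        status = assoc.send_c_store(ds)
        if status:
            print("C-STORE status: 0x{0:04x}".format(status.Status))
        assoc.release()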
- REST Enhancements
- File filtering
- File list- indexed retrieval
- Duplicate session on a different installation -- via new XAR
support
- Improved Web-conf-lib support
- Support XDAT project specific modifications to the dependencies
xml
- Support XDAT project specific modifications to the web.xml
Updated REST API
Supported Prearchive URIs
/data/prearchive/ (GET,PUT(refresh))
/data/prearchive/projects (GET,PUT(refresh))
/data/prearchive/projects/{PROJECT_ID} (GET,PUT(refresh))
/data/prearchive/projects/{PROJECT_ID}/timestamp/SESSION (GET
(xml,html,zip),DELETE,PUT(refresh))
/data/prearchive/projects/{PROJECT_ID}/timestamp/SESSION/scans (GET
(xml,html,csv,json))
/data/prearchive/projects/{PROJECT_ID}/timestamp/SESSION/scans/SCAN/resources
(GET (xml,html,csv,json))
/data/prearchive/projects/{PROJECT_ID}/timestamp/SESSION/scans/SCAN/resources/RES/files
(GET (xml,html,csv,json))
/data/prearchive/projects/{PROJECT_ID}/timestamp/SESSION/scans/SCAN/resources/RES/files/FILENAME
(GET)
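A usage sketch for the prearchive URIs above; the host, credentials,
project and timestamp are placeholders.

    # Sketch only: host, credentials, project and timestamp are placeholders.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")
    SESSION = "/data/prearchive/projects/DEMO/20110101_120000/SESSION1"

    # Session document as XML.
    xml = requests.get(XNAT + SESSION, params={"format": "xml"},
                       auth=AUTH).text

    # All files for the session, downloaded as a single zip.
    with open("SESSION1.zip", "wb") as out:
        out.write(requests.get(XNAT + SESSION,
                               params={"format": "zip"}, auth=AUTH).content)

    # Scan listing as JSON.
    scans = requests.get(XNAT + SESSION + "/scans",
                         params={"format": "json"}, auth=AUTH).json()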
Supported Archive URIs
/data/archive/projects/ID
/data/archive/projects/ID/subjects/ID
/data/archive/projects/ID/subjects/ID/experiments/ID
/data/archive/projects/ID/subjects/ID/experiments/ID/scans/ID
/data/archive/projects/ID/subjects/ID/experiments/ID/reconstructions/ID
/data/archive/projects/ID/subjects/ID/experiments/ID/assessors/ID
/data/archive/experiments/ID
/data/archive/experiments/ID/scans/ID
/data/archive/experiments/ID/reconstructions/ID
/data/archive/experiments/ID/assessors/ID
... pre-existing child uris for the archive resources are supported
here as well.
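A corresponding sketch for the archive URIs; all identifiers are
placeholders.

    # Sketch only: project, subject and experiment identifiers are placeholders.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")

    # Scans of an experiment via the full project/subject path...
    full = (XNAT + "/data/archive/projects/DEMO/subjects/SUBJ01"
                   "/experiments/SUBJ01_MR1/scans")
    scans = requests.get(full, params={"format": "json"}, auth=AUTH).json()

    # ...or directly by experiment ID, as the shorter forms above allow.
    short = XNAT + "/data/archive/experiments/XNAT_E00001/scans"
    same = requests.get(short, params={"format": "json"}, auth=AUTH).json()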
Supported User URIs
/data/user/cache/resource (GET (xml,html,csv,json))
/data/user/cache/resource/RES (GET (xml,html,csv,json,zip), POST
(file), DELETE)
/data/user/cache/resource/RES/files (GET (xml,html,csv,json), POST
(file), DELETE)
/data/user/cache/resource/RES/files/X (GET, PUT (file), DELETE)
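A sketch of working with the user cache space; the resource name and
file are made up for the example.

    # Sketch only: the resource name and file are made up for the example.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")
    CACHE = XNAT + "/data/user/cache/resource/scratch"

    # Store a file in the user-specific cache space for later use.
    with open("notes.csv", "rb") as f:
        requests.put(CACHE + "/files/notes.csv", data=f, auth=AUTH)

    # List the cached files, retrieve one, then delete it.
    listing = requests.get(CACHE + "/files",
                           params={"format": "json"}, auth=AUTH).json()
    content = requests.get(CACHE + "/files/notes.csv", auth=AUTH).content
    requests.delete(CACHE + "/files/notes.csv", auth=AUTH)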
Supported service URIs
/data/services/import
/data/services/archive
/data/services/prearchive/move
/data/services/prearchive/delete
For backwards compatibility, /REST can be used in place of /data
and the /archive level can be left off.
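As an illustration of the backwards-compatibility rule, the URIs below
address the same experiment (the experiment ID and host are
placeholders).

    # Sketch only: the experiment ID and host are placeholders.
    import requests

    XNAT = "https://xnat.example.org"
    AUTH = ("admin", "secret")

    # /REST is interchangeable with /data, and /archive may be omitted.
    equivalent = [
        "/data/archive/experiments/XNAT_E00001",
        "/data/experiments/XNAT_E00001",
        "/REST/experiments/XNAT_E00001",
    ]
    for uri in equivalent:
        r = requests.get(XNAT + uri, params={"format": "xml"}, auth=AUTH)
        print(uri, r.status_code)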