DDF Home
Welcome to the home of the DDF.
The Quick Start describes how to get up and running quickly.
Distributed Data Framework (DDF) is an agile and modular integration framework. It is primarily focused on data integration, enabling clients to insert, query and transform information from disparate data sources via the DDF Catalog. A Catalog API allows integrators to insert new capabilities at various stages throughout each operation. DDF is designed with several architectural qualities to benefit integrators:
Standardization
- Building on established Free and Open Source Software (FOSS) and open standards avoids vendor lock-in
Extensibility
- System Integrators can extend capabilities by developing and sharing new features
Flexibility
- System Integrators may deploy only those features required
Simplicity of installation and operation
- Unzip and run
- Configuration via a web console
Simplicity of Development
- Build simple Plain Old Java Objects (POJOs) and wire them in via a choice of dependency injection frameworks
- Make use of widely available documentation and components for DDF's underlying technologies
- Modular development supports multi-organizational and multi-regional teams
Included Applications
- DDF Content Application
- DDF Content Application Users Guide — Application providing the Content framework implementation, REST endpoint, Directory Monitor, plugin to the Catalog Framework, and content repository file system storage.
- DDF Content Application Install/Uninstall
- DDF Content REST CRUD Endpoint — allows clients to perform CRUD operations on DDF Content using REST, a simple architectural style, over HTTP
- DDF Content Core
- DDF Content Framework
- DDF Content Application Release Notes
- DDF Content Application Security Guide
- DDF Spatial Application
- DDF Spatial Application Users Guide — Application providing KML transformer and a KML network link endpoint that allows a user to generate a View-based KML Query Results Network Link.
- DDF Spatial Application Release Notes
- DDF Spatial Application Security Guide
- DDF Standard Search UI
- DDF Security Application
- DDF Security Application Users Guide — Application providing Authentication, Authorization, and Auditing services for DDF
- DDF Security Application Release Notes
- DDF Security Application Security Guide
- DDF Catalog Application
- DDF Catalog Application Users Guide — Describes the DDF Catalog application and available options for extending its capabilities
- DDF Catalog OpenSearch
- Federation — provides the capability to extend the DDF enterprise to include Remote Sources, which can include other instances of DDF
- Developing at the Framework Level — Help with framework concepts and development.
- Sources — connect Catalog components to data sources, both local and remote
- DDF Camel Components
- DDF Catalog Schematron
- Data Components — representations of data in the Catalog, primarily metadata represented as a Metacard
- Endpoints — components that accept external requests and interface with internal components, normalizing the request and denormalizing the response
- Developing Catalog Components — Describes how to create Catalog components. Used in conjunction with the Javadoc to begin extending the DDF Catalog.
- Catalog Plugins — process Catalog operations, generally before and after they are executed
- Resource Components — used to work with Resources, i.e., the data represented by the cataloged metadata
- DDF Catalog Core
- DDF Catalog Application Install/Uninstall
- Catalog Framework — the core of the Catalog application, routes requests and responses between all Catalog Components
- Eventing — allows endpoints (and thus external users) to create a "standing query" and be notified when a matching Metacard is created, updated, or deleted
- Operations — represent all transactions that occur in the Catalog, including requests and responses
- DDF Catalog REST
- Catalog Fanout Framework App — provides an implementation of the Catalog Framework that acts as a proxy, federating requests to all available sources
- Catalog Transformers — transform data to and from various formats
- DDF Catalog Application Release Notes
- DDF Catalog Application Security Guide
- DDF Solr Catalog Application
- DDF Solr Catalog Application Users Guide — Application providing an implementation of the CatalogProvider interface using Apache Solr (http://lucene.apache.org/solr/) as a data store.
- Standalone Solr Server — an Apache Solr instance as a Catalog data store within the distribution
- DDF Catalog Solr External Provider
- Solr Catalog Provider Configurations (OLD) — implementation of a CatalogProvider using Apache Solr as the data store.
- DDF Catalog Solr Embedded Provider
- DDF Solr Catalog Application Release Notes
- DDF Solr Catalog Application Security Guide
- DDF Platform Application
- DDF Platform Application Users Guide — Application providing the fundamental building blocks that the DDF distribution needs in order to run, including subsets of Karaf, CXF, Cellar, and Camel.
- Security Core API
- Developing Action Components — Describes how and why to create Action Components.
- Platform Global Settings — The Platform Global Settings are the system wide configuration settings used throughout DDF
- DDF Platform Application Install/Uninstall
- Platform Status Service
- DDF Metrics
- DDF Mime Framework
- DDF Platform Application Release Notes
- DDF Platform Application Security Guide
Please visit the DDF Users Guide and the DDF Developer's Guide for more information.
Building
Prerequisites
* Install the Java SE 7 JDK. The build is also compatible with JDK 6.0 Update 29 or later.
* Make sure that your JAVA_HOME environment variable is set to the newly installed JDK location, and that your PATH includes %JAVA_HOME%\bin (Windows) or $JAVA_HOME/bin (*nix).
* Install Maven 3.0.4 (or later). Make sure that your PATH includes the MVN_HOME/bin directory.
* In addition, access to a Maven repository with the latest project artifacts and dependencies is necessary in order for a successful build. The following sample settings.xml (the default settings file) can be used to access the public repositories with the required artifacts. For more help on how to use the settings.xml file, see the Maven settings reference page.
<settings>
  <!-- If proxy is needed
  <proxies>
    <proxy>
    </proxy>
  </proxies>
  -->
</settings>
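If an outgoing proxy is required, the commented-out proxies section above can be filled in. A minimal sketch, using a hypothetical proxy host and port (replace with your own values):
<settings>
  <proxies>
    <proxy>
      <id>corporate-proxy</id>        <!-- hypothetical id -->
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>  <!-- hypothetical host -->
      <port>8080</port>               <!-- hypothetical port -->
    </proxy>
  </proxies>
</settings>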
Handy Tip on Encrypting Passwords
See this Maven guide on how to encrypt the passwords in your settings.xml.
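As a rough sketch of that workflow (the password values below are placeholders): generate an encrypted master password, store it in ~/.m2/settings-security.xml, then encrypt each server password for use in settings.xml.
# Generate an encrypted master password (placeholder value shown)
mvn --encrypt-master-password mymasterpassword
# Put the output in ~/.m2/settings-security.xml:
#   <settingsSecurity>
#     <master>{encrypted-value-from-above}</master>
#   </settingsSecurity>
# Then encrypt an individual server password and paste the output
# into the matching <password> element of settings.xml
mvn --encrypt-password myserverpassword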
Procedures
Run the Build
In order to run through a full build, be sure to clone all of the following repositories into the same folder (a sample sequence of clone commands is shown after the list):
- ddf (https://github.com/codice/ddf.git)
- ddf-catalog (https://github.com/codice/ddf-catalog.git)
- ddf-content (https://github.com/codice/ddf-content.git)
- ddf-parent (https://github.com/codice/ddf-parent.git)
- ddf-platform (https://github.com/codice/ddf-platform.git)
- ddf-security (https://github.com/codice/ddf-security.git)
- ddf-solr (https://github.com/codice/ddf-solr.git)
- ddf-spatial (https://github.com/codice/ddf-spatial.git)
- ddf-support (https://github.com/codice/ddf-support.git)
- ddf-ui (https://github.com/codice/ddf-ui.git)
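A sample sequence of clone commands, run from an empty working folder (the folder name ddf-build is just an example):
# Clone all repositories into the same folder
mkdir ddf-build && cd ddf-build
git clone https://github.com/codice/ddf.git
git clone https://github.com/codice/ddf-catalog.git
git clone https://github.com/codice/ddf-content.git
git clone https://github.com/codice/ddf-parent.git
git clone https://github.com/codice/ddf-platform.git
git clone https://github.com/codice/ddf-security.git
git clone https://github.com/codice/ddf-solr.git
git clone https://github.com/codice/ddf-spatial.git
git clone https://github.com/codice/ddf-support.git
git clone https://github.com/codice/ddf-ui.git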
Build command example for one individual repository.
# Build is run from the top level of the specified repository in a command line prompt or terminal.
cd ddf-support
mvn clean install
# At the end of the build, a BUILD SUCCESS will be displayed.
Build command example for all repositories. The commands must be run from the top-level folder that contains all the repositories; the sequence looks like this:
# Build is run from the top level folder that contains all the repositories in a command line prompt or terminal.
cd ddf-support
mvn clean install
cd ../ddf-parent
mvn clean install
cd ../ddf-platform
mvn clean install
cd ../ddf-security
mvn clean install
cd ../ddf-catalog
mvn clean install
cd ../ddf-content
mvn clean install
cd ../ddf-spatial
mvn clean install
cd ../ddf-solr
mvn clean install
cd ../ddf-ui
mvn clean install
cd ../ddf
mvn clean install
# This will fully compile each individual app. From here you may hot deploy the necessary apps on top of the DDF Kernel.
To be able to use the updated apps in a DDF Distribution, you will have to update the versions referenced in the "ddf" repository.
After the DDF app is built, the zip distribution of DDF can be found in the distribution/ddf/target directory.
Also note that you may create a reactor pom that allows this whole build process to run from a single pom rather than building each repository separately. This pom must reside in the top-level folder that holds all the repositories. An example of the file:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.codice.ddf</groupId>
    <artifactId>reactor</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <name>DDF Reactor</name>
    <description>Distributed Data Framework (DDF) is an open source, modular integration framework</description>
    <modules>
        <module>ddf-support</module>
        <module>ddf-parent</module>
        <module>ddf-platform</module>
        <module>ddf-security</module>
        <module>ddf-catalog</module>
        <module>ddf-content</module>
        <module>ddf-spatial</module>
        <module>ddf-solr</module>
        <module>ddf-ui</module>
        <module>ddf</module>
    </modules>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>2.4</version>
                <configuration>
                    <!-- Do not deploy the reactor pom -->
                    <skip>true</skip>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
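With a reactor pom like the one above saved as pom.xml in the top-level folder, the entire set of repositories can then be built with a single command:
# Run from the top-level folder that contains the reactor pom and all repositories
mvn clean install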
It usually takes some time for Maven to download the required dependencies during the first build. Build times may vary based on network speed and machine specifications.
In certain circumstances the build may fail with a 'java.lang.OutOfMemoryError: Java heap space' error. This is caused by the large number of sub-modules in the DDF build exhausting the heap space of the main Maven JVM. To fix this issue, set the environment variable MAVEN_OPTS to the value -Xmx512m -XX:MaxPermSize=256m before running the build.
Example on a Linux system with the bash shell:
export MAVEN_OPTS='-Xmx512m -XX:MaxPermSize=256m'
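A comparable sketch for a Windows command prompt (assuming the standard cmd.exe shell):
set MAVEN_OPTS=-Xmx512m -XX:MaxPermSize=256m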
How to Run
* Unzip the distribution.
* Run the executable at <distribution_home>/bin/ddf.bat or <distribution_home>/bin/ddf
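A minimal sketch of these two steps on a *nix system (the archive and folder names are illustrative and depend on the version that was built):
# Unzip the distribution produced by the build (file name is illustrative)
unzip ddf-<version>.zip
cd ddf-<version>
# Start DDF; on Windows use bin\ddf.bat instead
./bin/ddf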
Additional Information
The wiki is the primary place to find documentation about DDF.
Discussions can be found on the Announcements forum, Users forum, and Developers forum.
For a DDF binary distribution, please read the release notes on the wiki for a list of supported and unsupported features. If you find any issues with DDF, please submit reports with JIRA: https://tools.codice.org/jira/browse/DDF. For information on contributing to DDF see: http://www.codice.org/contributing.
The Website contains additional information at http://ddf.codice.org
Many thanks for using DDF.
-- The Codice DDF Development Team