DASH File Format Specification and File Intercommunication Architecture
Revision 2.1


Quick Summary

Innovation is not a one-time event. It is a continuous, cumulative process that must be embedded into every part of the fabric of a wider mechanism. One of the most common and most salient misconceptions is that innovation is primarily, if not exclusively, about changing or making breakthroughs in technology. Not only is this incorrect, it also shrinks the total addressable horizon of the innovation potential. Technology alone is not the fundamental engine: the configuration of the business model and its functions is just as important, and in most cases it is what works to maintain emphasis on the technology's unique value.

DASH innovates on both the computational and the business technology layer, where the innovation is defined by the capability the full model and its requirements enable: new value creation, sustainable ecosystem complexity and automated, real-time iteration cycles. DASH is a file format built for the metaverse and beyond. It solves information transfer and interoperability the right way, embedding the application layer directly into the file layer through elegant use of some of the most beautiful mathematics ever recorded by humanity.

In this revision of DASH, we look more explicitly at how the technological layer of DASH is further transformed, incentivised and iterated upon through the connected business blueprint.



Figure 1: DASH Service Design Blueprint




Business Function Service Design Blueprint Overview and Key Features

The source file (i.e. ZPAC) is first loaded into the user interface by the Creator through a simple drag-and-drop upload, and key information about the upload is stored in the Optimistic Transform Ledger.
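
As a minimal sketch, the ledger entry recorded at upload might carry a content hash, the Creator's identity and a timestamp. The field names, and the in-memory list standing in for the Optimistic Transform Ledger, are illustrative assumptions rather than part of the specification.

    import hashlib
    import time

    def record_upload(ledger, source_path, creator_id):
        # Hypothetical sketch: append an upload entry to the Optimistic
        # Transform Ledger (modelled here as a plain Python list).
        with open(source_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        entry = {
            "creator": creator_id,
            "source_file": source_path,   # e.g. a ZPAC source
            "sha256": digest,             # content hash for later integrity checks
            "timestamp": time.time(),
            "status": "uploaded",
        }
        ledger.append(entry)              # a production ledger would be persistent
        return entry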

Pre-processing of the source file then occurs: first, a ternary operator check of the source's perceived classification validity, then material instance segmentation across the model's coordinate data, detecting and delineating each distinct material region of interest so that a fidelity index can be gauged.
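
A sketch of the two pre-processing stages, under assumed semantics: the ternary check resolves to one of three states (valid, unknown, invalid), and segmentation groups the model's coordinate data by material id. The function names and data shapes are hypothetical.

    from enum import Enum

    class Classification(Enum):
        VALID = 1      # source classification confirmed
        UNKNOWN = 0    # cannot be determined; flag for review
        INVALID = -1   # source rejected

    def ternary_classification_check(source_format):
        # Assumed semantics of the "ternary operator check" on the source.
        if source_format == "ZPAC":
            return Classification.VALID
        if source_format is None:
            return Classification.UNKNOWN
        return Classification.INVALID

    def segment_material_instances(material_ids):
        # Group vertex indices by material id, delineating each distinct
        # material region of interest for later fidelity scoring.
        regions = {}
        for vertex_index, material in enumerate(material_ids):
            regions.setdefault(material, []).append(vertex_index)
        return regions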

From here, the Creator is guided through an estimation flow, allowing the system to comprehend, with higher reliability, the needs and requirements of that particular Creator. The work achieved through pre-processing outputs a Creative Friction / Pain Reduction Potential score for the Creator, reflecting the cost/time expense (%) that can be recovered if the source file is transformed to DASH before application layer deployment. Insight for the cost/time expense (%) estimate is derived from communication between the Transformation Set Progression and the Engine Plugin, and thus from the specific output application layer environment. Here, the Creator has optionality across 3 provided tiers (Low, Medium, High) and 1 custom category of coverage intensity, i.e. how faithful they wish DASH to be upon output to the specific graphic environment.
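
One way to read the estimate is as a saving fraction scaled by the chosen coverage intensity. The tier weights and the formula below are illustrative assumptions; in the actual flow the figure is informed by the Transformation Set Progression and the target engine environment.

    # Assumed coverage-intensity weights for the three provided tiers.
    TIER_WEIGHTS = {"low": 0.25, "medium": 0.5, "high": 0.85}

    def pain_reduction_potential(baseline_cost, dash_cost, tier="medium", custom=None):
        # Estimated % of cost/time expense recovered by transforming to DASH
        # before application layer deployment (illustrative formula).
        weight = custom if custom is not None else TIER_WEIGHTS[tier]
        saving = (baseline_cost - dash_cost) / baseline_cost
        return round(100 * saving * weight, 1)

For example, pain_reduction_potential(40, 12, tier="high") would report that roughly 59.5% of the expense could be recovered under the High tier, given those assumed weights.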

The threadline fidelity index is then scored for the Creator, providing a measure of the interoperability intensity requirement relative to the Creator's chosen coverage intensity (one of the three tiers: Low, Medium, High). A validation check is performed with the Creator, as the transposability score is weighed against the Creator's initial determination of the best cost/time expense saving.
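
A sketch of the scoring and validation step, assuming the threadline fidelity index aggregates per-region scores and is scaled by the chosen tier; both the aggregation and the tolerance are assumptions.

    def threadline_fidelity_index(region_scores, tier_weight):
        # Assumed aggregation: mean of per-material-region scores, scaled by
        # the weight of the Creator's chosen coverage intensity tier.
        return tier_weight * sum(region_scores) / len(region_scores)

    def validate_with_creator(transposability, target_saving, tolerance=0.05):
        # Weigh the transposability score against the Creator's initial
        # determination of the best cost/time expense saving.
        return transposability >= target_saving - tolerance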

The thread cost to the Creator for choosing the specified coverage intensity and approving the affinity of the fidelity index score is made apparent and labelled across 3 intentional headers: good deal, normal, expensive. The thread cost is determined from the Creator's inputs, aligning directly with their preference for making their own pipeline and creative work channel more streamlined. The thread cost is quoted in a natively deployed cryptocurrency utility token, which allows the customised DASH architecture's incentive and value transfer mechanism to maintain liquid sustainability.
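
The mapping from a token quote to the three headers could look like the following; the 10% bands around a reference price are assumptions for illustration.

    def label_thread_cost(quote_tokens, reference_tokens):
        # Map the utility-token quote onto the three headers shown to the
        # Creator. The bands around the reference price are assumed.
        ratio = quote_tokens / reference_tokens
        if ratio < 0.9:
            return "good deal"
        if ratio <= 1.1:
            return "normal"
        return "expensive"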

Upon approval and value transfer by the Creator, the pre-processed source file is fed through the Transformation Set, where limits on the number of dimensional transforms performed are set according to the Creator's verified choice of good deal, normal or expensive. The more transforms applied to the pre-processed source information, and so the more thorough the information space coverage, the higher the transposability score and thread cost, and the greater the resilience to reduction loss and compatibility with the application environment.
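
A sketch of how the verified pricing choice could cap the number of dimensional transforms; the specific limits and the transform pipeline are hypothetical.

    # Assumed caps on dimensional transforms per verified pricing choice.
    TRANSFORM_LIMITS = {"good deal": 8, "normal": 16, "expensive": 32}

    def run_transformation_set(preprocessed, pricing_label, transforms):
        # Apply transforms up to the cap implied by the Creator's choice;
        # broader coverage raises transposability, thread cost and fidelity.
        limit = TRANSFORM_LIMITS[pricing_label]
        result = preprocessed
        for transform in transforms[:limit]:
            result = transform(result)
        return result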

From here, post-processing of the now transformed source file takes place, as it is prepared for hand-off to the plugin integrator and assembled for export as .DASH. The engine plugin licenses .DASH to be read and recognised correctly by the engine. Run-time analysis and performance checks are then performed across the object, scene, project and engine, with the plugin accounting for any re-optimisations required to increase the affinity of the 3D object within the graphic environment, i.e. whether the file should be run through the Transformation Set again, and with which recommended customised transform parameters. Here, the Creator can choose to run the analysis on a continuous, periodic or manually targeted basis, with the native crypto utility token providing the necessary liquidity for the analysis and performance checks to continue.
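
The three analysis cadences could be driven by a simple scheduler, as sketched below; the mode names mirror the text, while the interval and the blocking loop are assumptions.

    import sched
    import time

    def schedule_runtime_analysis(analyse, mode="periodic", interval_s=3600):
        # 'analyse' stands in for the object/scene/project/engine checks and
        # may recommend re-running the Transformation Set with new parameters.
        if mode == "manual":
            return analyse()                 # one targeted run, on demand
        scheduler = sched.scheduler(time.time, time.sleep)
        delay = 1 if mode == "continuous" else interval_s
        def tick():
            analyse()
            scheduler.enter(delay, 1, tick)  # queue the next analysis pass
        scheduler.enter(0, 1, tick)
        scheduler.run()                      # blocks; a plugin would use a worker thread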

Open Source Community Incentive Network

The most salient mantra and embedded conviction in the DASH architecture is the spirit of Open Source: building an incentivised, developer- and research-driven ecosystem that allows differing functions, models and datasets to be proposed and incorporated into the dynamic, swappable segments of the Transformation Set. This ensures continuous, viable upgrades and iterations can be made for more refined optimisation.

Here, customisations to the functions applied in the Transformation Set can be proposed by Researcher and Developer Optimists within the community. Their models are first given an internal claim expectation before being submitted for consideration as a customisation option for the Creator, i.e. swappable into the Transformation Set if chosen by the Creator as they are guided through the estimation flow.

The Researcher and Developer Optimists must stake the native utility token on their model as proof of good-actor validation. If their model is then chosen by other members of the community, through a quadratic voting and liquidity nomination mechanism, it can be publicly used within the broader DASH architecture.
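
Quadratic voting is the one mechanism here with a standard formula: casting n votes costs n² tokens, so influence grows only with the square root of the tokens committed. The tallying helper below is an illustrative reading of the liquidity nomination step, not a specified implementation.

    def quadratic_vote_cost(votes):
        # Standard quadratic voting: n votes cost n**2 tokens, damping the
        # influence any single large holder can buy.
        return votes ** 2

    def tally_nominations(nominations):
        # nominations: iterable of (model_id, tokens_committed). Effective
        # votes grow with the square root of the tokens committed (assumed).
        totals = {}
        for model_id, tokens in nominations:
            totals[model_id] = totals.get(model_id, 0.0) + tokens ** 0.5
        return max(totals, key=totals.get)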

For providing work to the ecosystem, the community Optimist who developed the model/function set is rewarded in the same native token for their authenticated contribution: the staked amount gains yield on winning transform plays and is burned against losing ones. To further prevent the system from being gamed by whale actors, the right is reserved to apply a GAN to randomly simulate the models at no additional cost; if negative anomalies are identified, the staked amount is slashed and returned to the network.
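
The reward, burn and slash outcomes could be settled as below; the yield and burn rates, and slashing the stake in full, are illustrative assumptions rather than specified parameters.

    def settle_stake(stake, outcome, yield_rate=0.05, burn_rate=0.05):
        # Settle a contributor's stake after a transform play or GAN audit.
        if outcome == "win":
            return stake * (1 + yield_rate)   # staked amount gains yield
        if outcome == "lose":
            return stake * (1 - burn_rate)    # part of the stake is burned
        if outcome == "anomaly":
            return 0.0                        # slashed; value returned to the network
        return stake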