| Age | Commit message | Author |
|
This change updates the WTF trace reader to support the new streaming
trace API.
|
|
This change updates the SWF trace reader to support the new streaming
trace API.
|
|
This change moves Bitbrains trace support into a separate module and
adds support for the new trace API.
|
|
This change removes the environment reader from the format library,
since it is highly specific to a particular experiment. In the future,
we hope to have a single format to set up the entire datacenter
(perhaps similar to the format used by the web runner).
|
|
This change extracts the Parquet helpers from the format module into a
new module, in order to improve the re-usability of these helpers.
|
|
This change starts the process of moving the different trace formats
into separate modules. In particular, it moves the GWF trace format into
a new module, opendc-trace-gwf, and also implements the trace API for
the GWF module.
|
|
This change updates the code of the Bitbrains trace reader and upgrades
the TraceConverter to re-use the existing code of that reader.
|
|
This change refactors the trace workload in the OpenDC simulator to
execute each fragment based on the fragment's timestamp. This ensures
that the trace is replayed identically to the original execution.
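
As an illustration of the idea (the types and hooks below are
hypothetical stand-ins, not the actual OpenDC classes), replaying by
absolute timestamp means waiting for each fragment's own start time
rather than chaining durations, so drift in one fragment cannot shift
all later fragments:

```kotlin
// Hypothetical sketch of timestamp-based replay.
data class Fragment(val timestamp: Long, val duration: Long, val cpuUsage: Double)

suspend fun replay(
    fragments: List<Fragment>,
    now: () -> Long,                       // current simulation time
    delayUntil: suspend (Long) -> Unit,    // suspend until the given timestamp
    execute: suspend (Fragment) -> Unit    // run a single fragment
) {
    for (fragment in fragments) {
        // Key point: wait for the fragment's *absolute* timestamp instead of
        // sleeping for the previous fragment's duration.
        if (fragment.timestamp > now()) {
            delayUntil(fragment.timestamp)
        }
        execute(fragment)
    }
}
```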
|
|
This change reimplements the performance interference model to work on
top of the universal resource model in `opendc-simulator-resources`.
This enables us to model interference and performance variability of
other resources, such as disk or network, in the future.
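
A minimal sketch of what a resource-agnostic interference model could
look like (these interfaces are hypothetical, not the actual
`opendc-simulator-resources` API): interference is expressed as a score
that scales the speed a consumer receives from any resource, be it CPU,
disk, or network:

```kotlin
// Hypothetical sketch; the real model in OpenDC may differ.
interface InterferenceModel {
    /**
     * Return a performance score in (0, 1] for [member], given the members
     * that are currently active on the same resource.
     */
    fun score(member: String, active: Set<String>): Double
}

/** Apply the model by scaling the speed a resource consumer actually receives. */
fun effectiveSpeed(
    requestedSpeed: Double,
    member: String,
    active: Set<String>,
    model: InterferenceModel
): Double = requestedSpeed * model.score(member, active)
```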
|
|
This change updates the trace reader implementations to remove their
dependency on the performance interference model. In a future commit, we
will instead pass the performance interference model via the
host/hypervisor.
|
|
This change re-organizes the classes of the compute simulator module to
make a clearer distinction between the hardware, firmware and software
interfaces in this module.
|
|
This change fixes an issue where the power in the energy experiments is
always reported as zero due to the changes in commit 652b869.
|
|
This change eliminates all Hadoop dependencies that are not necessary
for Parquet to work correctly. As a result, the number of dependencies
should now be greatly reduced, which in turn leads to fewer artifacts
that need to be retrieved at build time.
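
A sketch of how such trimming can look in the Gradle Kotlin DSL; the
coordinates, versions, and the exact set of exclusions are illustrative,
since the safe set depends on which Parquet features are actually used:

```kotlin
// build.gradle.kts (illustrative): keep Parquet, drop the Hadoop pieces that
// are only needed when talking to a real Hadoop cluster.
dependencies {
    implementation("org.apache.parquet:parquet-avro:1.12.0")

    // Parquet still needs a small slice of Hadoop on the classpath.
    implementation("org.apache.hadoop:hadoop-common:3.3.1") {
        exclude(group = "org.apache.hadoop", module = "hadoop-auth")
        exclude(group = "org.apache.zookeeper")
        exclude(group = "org.apache.curator")
        exclude(group = "org.eclipse.jetty")
    }
}
```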
|
|
This change adds an implementation of Parquet's OutputFile for local
files in order to eliminate the dependency on the entire Hadoop system.
This implementation allows users to write Parquet files locally without
needing a Hadoop filesystem implementation.
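
A minimal sketch of such an implementation in Kotlin (the class name is
illustrative; it assumes Parquet's `org.apache.parquet.io.OutputFile`
and `PositionOutputStream` types):

```kotlin
import org.apache.parquet.io.OutputFile
import org.apache.parquet.io.PositionOutputStream
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.StandardOpenOption

/** Illustrative OutputFile that writes to a local path without Hadoop. */
class LocalParquetOutputFile(private val path: Path) : OutputFile {
    override fun create(blockSizeHint: Long): PositionOutputStream =
        open(StandardOpenOption.CREATE_NEW)

    override fun createOrOverwrite(blockSizeHint: Long): PositionOutputStream =
        open(StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)

    override fun supportsBlockSize(): Boolean = false
    override fun defaultBlockSize(): Long = 0

    private fun open(vararg options: StandardOpenOption): PositionOutputStream {
        val output = Files.newOutputStream(path, StandardOpenOption.WRITE, *options)
        return object : PositionOutputStream() {
            private var pos = 0L
            override fun getPos(): Long = pos
            override fun write(b: Int) { output.write(b); pos++ }
            override fun write(b: ByteArray, off: Int, len: Int) { output.write(b, off, len); pos += len }
            override fun flush() = output.flush()
            override fun close() = output.close()
        }
    }
}
```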
|
|
This change updates the Parquet readers used in the Capelin experiments
to use our InputFile implementation for local files, which reduces our
dependency on Apache Hadoop.
|
|
This change updates the format implementations that use Parquet by
switching to our InputFile implementation for local files, which
eliminates the need for Hadoop's filesystem support.
|
|
This change adds an implementation of Parquet's InputFile for local
files in order to eliminate the dependency on the entire Hadoop system.
This implementation allows users to read Parquet files locally without
needing a Hadoop filesystem implementation.
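
A minimal sketch of what such an InputFile can look like in Kotlin (the
class name is illustrative; it builds on Parquet's
`DelegatingSeekableInputStream` helper to avoid implementing the whole
`SeekableInputStream` surface by hand):

```kotlin
import org.apache.parquet.io.DelegatingSeekableInputStream
import org.apache.parquet.io.InputFile
import org.apache.parquet.io.SeekableInputStream
import java.nio.channels.Channels
import java.nio.channels.FileChannel
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.StandardOpenOption

/** Illustrative InputFile that reads a local path without Hadoop. */
class LocalParquetInputFile(private val path: Path) : InputFile {
    override fun getLength(): Long = Files.size(path)

    override fun newStream(): SeekableInputStream {
        val channel = FileChannel.open(path, StandardOpenOption.READ)
        return object : DelegatingSeekableInputStream(Channels.newInputStream(channel)) {
            override fun getPos(): Long = channel.position()
            override fun seek(newPos: Long) { channel.position(newPos) }
        }
    }
}
```

A reader can then be opened with, for example,
`ParquetFileReader.open(LocalParquetInputFile(path))` instead of going
through a Hadoop `Path`.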
|
|
This change fixes the SLF4J logging warnings that occur during the
tests.
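
A common way to silence such warnings, shown here as an illustrative
sketch rather than necessarily what this commit did, is to put an SLF4J
binding on the test runtime classpath:

```kotlin
// build.gradle.kts (illustrative): provide a logging binding for tests so SLF4J
// does not fall back to its no-operation logger with a warning.
dependencies {
    testRuntimeOnly("org.slf4j:slf4j-simple:1.7.30")
}
```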
|
|
This change addresses the deprecations that were caused by the migration
to Kotlin 1.5.
|
|
|
|
This change adds support for the Gradle version catalog feature in our
build configuration. This allows us to have a single file,
gradle/libs.versions.toml, which contains all the dependency versions
used in this project.
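
A sketch of how the catalog is consumed (the alias and versions are
illustrative; on the Gradle releases current at the time, the feature
also had to be enabled as a preview):

```kotlin
// settings.gradle.kts: enable the (then preview) version catalog feature.
enableFeaturePreview("VERSION_CATALOGS")

// gradle/libs.versions.toml would contain entries such as:
//   [versions]
//   parquet = "1.12.0"
//   [libraries]
//   parquet-avro = { module = "org.apache.parquet:parquet-avro", version.ref = "parquet" }

// build.gradle.kts of a module: the alias becomes a type-safe accessor.
dependencies {
    implementation(libs.parquet.avro)
}
```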
|
|
This change updates the build scripts to use type-safe project accessors
when specifying build dependencies between modules.
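
For illustration (the module path is an example, and on the Gradle
releases of the time the feature sat behind a preview flag), a
dependency on the opendc-trace-gwf module changes like this:

```kotlin
// settings.gradle.kts: enable the (then preview) type-safe project accessors.
enableFeaturePreview("TYPESAFE_PROJECT_ACCESSORS")

// build.gradle.kts of a consuming module:
dependencies {
    // Before: a string-based project path.
    // implementation(project(":opendc-trace-gwf"))

    // After: a type-safe accessor generated from the project name.
    implementation(projects.opendcTraceGwf)
}
```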
|
|
This change flattens the project structure. Previously, the simulator,
frontend and API each lived in their own directory.
With this change, all modules of the project live in the top-level
directory of the repository. This should improve the discoverability of
the project's modules.
|