This repository was archived by the owner on Apr 24, 2021. It is now read-only.


# Chrome Dev Editor

A Chrome App-based development environment.


## Requirements

The Dart IDE needs to be installed, and `dart/dart-sdk/bin` needs to be accessible from `$PATH`. In addition, you should set a `DART_SDK` environment variable pointing to `your/path/to/dart-sdk`.
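For example, on Linux or macOS the environment could be set up along these lines (a sketch only; the SDK path below is an assumption — adjust it for your machine):

```shell
# Sketch of the environment setup; the SDK location below is an assumption.
export DART_SDK="$HOME/dart/dart-sdk"
export PATH="$DART_SDK/bin:$PATH"

# Verify the SDK is visible (only if the tools are actually installed):
if command -v dart >/dev/null 2>&1; then
  dart --version
fi
```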

We're currently developing against the weekly development release of the Dart SDK.

## Entry Point

The main entry point to the Chrome App is `app/manifest.json`. It defines the background script for the application (`app/background.js`). This script gets invoked when the application starts. It opens a new window with the contents set to the `app/spark_polymer.html` file. This file in turn runs `app/spark_polymer.dart`.
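As a rough illustration, a Chrome App manifest that wires up a background script has this shape (the field values here are illustrative, not copied from the project's actual manifest):

```json
{
  "name": "Chrome Dev Editor",
  "version": "0.1.0",
  "manifest_version": 2,
  "app": {
    "background": {
      "scripts": ["background.js"]
    }
  }
}
```

The background script then typically opens the app's main window from a `chrome.app.runtime.onLaunched` listener via `chrome.app.window.create(...)`, which is how the HTML entry point gets loaded.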

## Dependencies

Dependencies need to be fetched first, using pub. Run:

```
pub get
```

## Packages

Chrome apps do not like symlinks. We use pub and a `pubspec.yaml` to provision our package dependencies, but we then physically copy all the packages into the `app/packages` directory. This is not a normal symlinked pub directory, but it has the same layout as one.

Run:

```
./grind setup
```

to copy library code from `packages/` to `app/packages/`.
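As a sketch of the idea (this is not the actual `grind setup` implementation, just an illustration of turning a pub-style symlink into a physical copy with the same layout):

```shell
# Illustration only: materialize a symlinked package directory as real files,
# since Chrome apps cannot load content through symlinks.
set -e
workdir="$(mktemp -d)"
mkdir -p "$workdir/packages" "$workdir/app"
printf 'library foo;\n' > "$workdir/packages/foo.dart"

# A pub-style symlink into the shared packages directory:
ln -s "$workdir/packages" "$workdir/app/packages_link"

# cp -RL follows symlinks, producing a physical copy with the same layout:
cp -RL "$workdir/app/packages_link" "$workdir/app/packages"

copied="$(cat "$workdir/app/packages/foo.dart")"
echo "$copied"
rm -rf "$workdir"
```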

## Lib

All the Dart code for the application (modulo the `spark_polymer.*` entry point and the `spark_polymer_ui.*` top-level UI) lives in the `app/lib` directory.

## Tests

All the tests live in `app/test`. These are standard Dart unit tests. Generally, one library under test corresponds to one test file, and all test files should be referenced from `all.dart`.
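For illustration, a test file in that style might look like the following. This is a hypothetical sketch, not taken from the actual sources: the `capitalize` function and group names are invented, and it assumes the old `unittest` package from the Dart SDK era this project targets.

```dart
// Hypothetical test file sketch for app/test; names are illustrative.
import 'package:unittest/unittest.dart';

// The function under test, defined here only to keep the sketch
// self-contained; real code under test lives in app/lib.
String capitalize(String s) =>
    s.isEmpty ? s : s[0].toUpperCase() + s.substring(1);

void main() {
  group('capitalize', () {
    test('upper-cases the first letter', () {
      expect(capitalize('spark'), equals('Spark'));
    });
    test('leaves an empty string unchanged', () {
      expect(capitalize(''), equals(''));
    });
  });
}
```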

Run

```
./grind mode-test
```

to switch the app over to including tests (the default mode).

More about the testing story here.

## Getting Code, Development and Contributing

Contributions are welcome! See our Wiki for details on how to get the code, run, debug and build Spark, and contribute the code back.