This repository has been archived by the owner on Apr 22, 2023. It is now read-only.

Commit

Merge remote-tracking branch 'ry/v0.10' into v0.10-merge
Conflicts:
	AUTHORS
	ChangeLog
	deps/uv/ChangeLog
	deps/uv/config-unix.mk
	deps/uv/src/unix/stream.c
	deps/uv/src/version.c
	deps/uv/uv.gyp
	src/node.cc
	src/node_buffer.cc
	src/node_crypto.cc
	src/node_version.h
	src/stream_wrap.cc
	src/stream_wrap.h
isaacs committed May 17, 2013
2 parents 7998843 + f59ab10 commit e59141e
Showing 32 changed files with 1,378 additions and 891 deletions.
1 change: 1 addition & 0 deletions AUTHORS
@@ -451,4 +451,5 @@ Sam Roberts <vieuxtech@gmail.com>
Kevin Locke <kevin@kevinlocke.name>
Daniel Moore <polaris@northhorizon.net>
Robert Kowalski <rok@kowalski.gd>
Benoit Vallée <github@benoitvallee.net>
Nick Sullivan <nick@sullivanflock.com>
3 changes: 2 additions & 1 deletion CONTRIBUTING.md
@@ -91,7 +91,8 @@ nicely even when it is indented.
The header line should be meaningful; it is what other people see when they
run `git shortlog` or `git log --oneline`.

-Have a look at `git log` for inspiration.
+Check the output of `git log --oneline files_that_you_changed` to find out
+what subsystem (or subsystems) your changes touch.


### REBASE
13 changes: 13 additions & 0 deletions ChangeLog
@@ -60,6 +60,19 @@
* zlib: allow passing options to convenience methods (Kyle Robinson Young)


2013.05.14, Version 0.10.6 (Stable), 5deb1672f2b5794f8be19498a425ea4dc0b0711f

* module: Deprecate require.extensions (isaacs)

* stream: make Readable.wrap support objectMode, empty streams (Daniel Moore)

* child_process: fix handle delivery (Ben Noordhuis)

* crypto: Fix performance regression (isaacs)

* src: DRY string encoding/decoding (isaacs)


2013.04.23, Version 0.10.5 (Stable), deeaf8fab978e3cadb364e46fb32dafdebe5f095

* uv: Upgrade to 0.10.5 (isaacs)
4 changes: 1 addition & 3 deletions benchmark/crypto/cipher-stream.js
@@ -16,8 +16,6 @@ function main(conf) {
api = 'legacy';
}

-var dur = conf.dur;

var crypto = require('crypto');
var assert = require('assert');
var alice = crypto.getDiffieHellman('modp5');
@@ -73,7 +71,7 @@ function streamWrite(alice, bob, message, encoding, writes) {
bob.on('end', function() {
// Gbits
var bits = written * 8;
-var gbits = written / (1024 * 1024 * 1024);
+var gbits = bits / (1024 * 1024 * 1024);
bench.end(gbits);
});
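The one-line change above fixes a units bug: the old code divided the byte count, rather than the bit count, by 2^30, so the reported "Gbits" figure was off by a factor of 8. A standalone sketch of the difference (not part of the patch):

```javascript
// Illustration of the units bug fixed above: gigabits must be
// derived from the bit count, not the byte count.
var written = 1024 * 1024 * 1024; // pretend 1 GiB was written
var bits = written * 8;

var buggy = written / (1024 * 1024 * 1024); // old code: 1 (gibibytes, mislabeled)
var fixed = bits / (1024 * 1024 * 1024);    // new code: 8 (gibibits)

console.log(buggy, fixed); // 1 8
```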

86 changes: 86 additions & 0 deletions benchmark/crypto/hash-stream-creation.js
@@ -0,0 +1,86 @@
// throughput benchmark
// creates a single hasher, then pushes a bunch of data through it
var common = require('../common.js');
var crypto = require('crypto');

var bench = common.createBenchmark(main, {
writes: [500],
algo: [ 'sha256', 'md5' ],
type: ['asc', 'utf', 'buf'],
out: ['hex', 'binary', 'buffer'],
len: [2, 1024, 102400, 1024 * 1024],
api: ['legacy', 'stream']
});

function main(conf) {
var api = conf.api;
if (api === 'stream' && process.version.match(/^v0\.[0-8]\./)) {
console.error('Crypto streams not available until v0.10');
// use the legacy, just so that we can compare them.
api = 'legacy';
}

var crypto = require('crypto');
var assert = require('assert');

var message;
var encoding;
switch (conf.type) {
case 'asc':
message = new Array(conf.len + 1).join('a');
encoding = 'ascii';
break;
case 'utf':
message = new Array(conf.len / 2 + 1).join('ü');
encoding = 'utf8';
break;
case 'buf':
message = new Buffer(conf.len);
message.fill('b');
break;
default:
throw new Error('unknown message type: ' + conf.type);
}

var fn = api === 'stream' ? streamWrite : legacyWrite;

bench.start();
fn(conf.algo, message, encoding, conf.writes, conf.len, conf.out);
}

function legacyWrite(algo, message, encoding, writes, len, outEnc) {
var written = writes * len;
var bits = written * 8;
var gbits = bits / (1024 * 1024 * 1024);

while (writes-- > 0) {
var h = crypto.createHash(algo);
h.update(message, encoding);
var res = h.digest(outEnc);

// include buffer creation costs for older versions
if (outEnc === 'buffer' && typeof res === 'string')
res = new Buffer(res, 'binary');
}

bench.end(gbits);
}

function streamWrite(algo, message, encoding, writes, len, outEnc) {
var written = writes * len;
var bits = written * 8;
var gbits = bits / (1024 * 1024 * 1024);

while (writes-- > 0) {
var h = crypto.createHash(algo);

if (outEnc !== 'buffer')
h.setEncoding(outEnc);

h.write(message, encoding);
h.end();
h.read();
}

bench.end(gbits);
}
77 changes: 77 additions & 0 deletions benchmark/crypto/hash-stream-throughput.js
@@ -0,0 +1,77 @@
// throughput benchmark
// creates a single hasher, then pushes a bunch of data through it
var common = require('../common.js');
var crypto = require('crypto');

var bench = common.createBenchmark(main, {
writes: [500],
algo: [ 'sha256', 'md5' ],
type: ['asc', 'utf', 'buf'],
len: [2, 1024, 102400, 1024 * 1024],
api: ['legacy', 'stream']
});

function main(conf) {
var api = conf.api;
if (api === 'stream' && process.version.match(/^v0\.[0-8]\./)) {
console.error('Crypto streams not available until v0.10');
// use the legacy, just so that we can compare them.
api = 'legacy';
}

var crypto = require('crypto');
var assert = require('assert');

var message;
var encoding;
switch (conf.type) {
case 'asc':
message = new Array(conf.len + 1).join('a');
encoding = 'ascii';
break;
case 'utf':
message = new Array(conf.len / 2 + 1).join('ü');
encoding = 'utf8';
break;
case 'buf':
message = new Buffer(conf.len);
message.fill('b');
break;
default:
throw new Error('unknown message type: ' + conf.type);
}

var fn = api === 'stream' ? streamWrite : legacyWrite;

bench.start();
fn(conf.algo, message, encoding, conf.writes, conf.len);
}

function legacyWrite(algo, message, encoding, writes, len) {
var written = writes * len;
var bits = written * 8;
var gbits = bits / (1024 * 1024 * 1024);
var h = crypto.createHash(algo);

while (writes-- > 0)
h.update(message, encoding);

h.digest();

bench.end(gbits);
}

function streamWrite(algo, message, encoding, writes, len) {
var written = writes * len;
var bits = written * 8;
var gbits = bits / (1024 * 1024 * 1024);
var h = crypto.createHash(algo);

while (writes-- > 0)
h.write(message, encoding);

h.end();
h.read();

bench.end(gbits);
}
6 changes: 3 additions & 3 deletions doc/api/fs.markdown
@@ -686,7 +686,7 @@ An example to read the last 10 bytes of a file which is 100 bytes long:

## Class: fs.ReadStream

-`ReadStream` is a [Readable Stream](stream.html#stream_readable_stream).
+`ReadStream` is a [Readable Stream](stream.html#stream_class_stream_readable).

### Event: 'open'

@@ -710,9 +710,9 @@ some position past the beginning of the file. Modifying a file rather
than replacing it may require a `flags` mode of `r+` rather than the
default mode `w`.

-## fs.WriteStream
+## Class: fs.WriteStream

-`WriteStream` is a [Writable Stream](stream.html#stream_writable_stream).
+`WriteStream` is a [Writable Stream](stream.html#stream_class_stream_writable).

### Event: 'open'

7 changes: 4 additions & 3 deletions doc/api/globals.markdown
@@ -133,9 +133,10 @@ See the [module system documentation][] for more information.

<!-- type=var -->

-An object which is shared between all instances of the current module and
-made accessible through `require()`.
-`exports` is the same as the `module.exports` object.
+A reference to the `module.exports` object which is shared between all
+instances of the current module and made accessible through `require()`.
+See [module system documentation][] for details on when to use `exports` and
+when to use `module.exports`.
`exports` isn't actually a global but rather local to each module.

See the [module system documentation][] for more information.
24 changes: 19 additions & 5 deletions doc/api/modules.markdown
@@ -30,6 +30,20 @@ The module `circle.js` has exported the functions `area()` and
`circumference()`. To export an object, add to the special `exports`
object.

Note that `exports` is a reference to `module.exports` making it suitable
for augmentation only. If you are exporting a single item such as a
constructor you will want to use `module.exports` directly instead.

function MyConstructor (opts) {
//...
}

// BROKEN: Does not modify exports
exports = MyConstructor;

// exports the constructor properly
module.exports = MyConstructor;

Variables
local to the module will be private. In this example the variable `PI` is
private to `circle.js`.
@@ -73,7 +87,7 @@ Consider this situation:
When `main.js` loads `a.js`, then `a.js` in turn loads `b.js`. At that
point, `b.js` tries to load `a.js`. In order to prevent an infinite
loop an **unfinished copy** of the `a.js` exports object is returned to the
-`b.js` module. `b.js` then finishes loading, and its exports object is
+`b.js` module. `b.js` then finishes loading, and its `exports` object is
provided to the `a.js` module.

By the time `main.js` has loaded both modules, they're both finished.
@@ -219,14 +233,14 @@ would resolve to different files.

In each module, the `module` free variable is a reference to the object
representing the current module. In particular
-`module.exports` is the same as the `exports` object.
+`module.exports` is accessible via the `exports` module-global.
`module` isn't actually a global but rather local to each module.

### module.exports

* {Object}

-The `exports` object is created by the Module system. Sometimes this is not
+The `module.exports` object is created by the Module system. Sometimes this is not
acceptable, many want their module to be an instance of some class. To do this
assign the desired export object to `module.exports`. For example suppose we
were making a module called `a.js`
@@ -267,13 +281,13 @@ y.js:
### module.require(id)

* `id` {String}
-* Return: {Object} `exports` from the resolved module
+* Return: {Object} `module.exports` from the resolved module

The `module.require` method provides a way to load a module as if
`require()` was called from the original module.

Note that in order to do this, you must get a reference to the `module`
-object. Since `require()` returns the `exports`, and the `module` is
+object. Since `require()` returns the `module.exports`, and the `module` is
typically *only* available within a specific module's code, it must be
explicitly exported in order to be used.

6 changes: 4 additions & 2 deletions doc/api/stream.markdown
@@ -131,13 +131,15 @@ TLS, may ignore this argument, and simply provide data whenever it
becomes available. There is no need, for example to "wait" until
`size` bytes are available before calling `stream.push(chunk)`.

-### readable.push(chunk)
+### readable.push(chunk, [encoding])

* `chunk` {Buffer | null | String} Chunk of data to push into the read queue
+* `encoding` {String} Encoding of String chunks. Must be a valid
+  Buffer encoding, such as `'utf8'` or `'ascii'`
* return {Boolean} Whether or not more pushes should be performed

Note: **This function should be called by Readable implementors, NOT
-by consumers of Readable subclasses.** The `_read()` function will not
+by consumers of Readable streams.** The `_read()` function will not
be called again until at least one `push(chunk)` call is made. If no
data is available, then you MAY call `push('')` (an empty string) to
allow a future `_read` call, without adding any data to the queue.
