NodeNets

NodeNet(bin_im)

Parameters:

bin_im: a binary mask; object voxels should be false, non-object voxels should be true.

Return: a nodeNet object

source
NodeNet( points; penalty_fn = alexs_penalty )

Perform the TEASAR algorithm on the passed N×d array of points

source
NodeNet( seg, obj_id; penalty_fn=alexs_penalty)

Perform the TEASAR algorithm on the passed binary array.

source
alexs_penalty( weights, dbf )

Returns a version of the edge weights, modified by the DBF-based TEASAR penalty:

w = w * 5000 .* (1 - dbf/maxDBF).^(16)

The factor of 5000 shouldn't make a difference, but I've left it here.

source
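As a rough illustration of this penalty, here is a hedged sketch; the dense-matrix form and the use of the destination node's DBF value are assumptions for illustration, not the package's implementation:

```julia
# Hedged sketch of the penalty above. Assumes `weights[i, j]` is the weight of
# the edge i -> j and `dbf[j]` the DBF value at the destination node j.
function dbf_penalty_sketch(weights::AbstractMatrix, dbf::AbstractVector)
    maxDBF = maximum(dbf)
    penalized = Float64.(weights)
    for j in 1:size(penalized, 2), i in 1:size(penalized, 1)
        penalized[i, j] *= 5000 * (1 - dbf[j] / maxDBF)^16
    end
    return penalized
end
```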
consolidate_paths( path_list )

Extracts the unique nodes and edges among all paths in the list

source
create_node_lookup( points )

Abstractly represents the points as a volume by means of linear indices into a sparse vector.

source
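The idea can be sketched as follows; the subscript-to-linear-index scheme and the function name are illustrative assumptions, not the package's internals:

```julia
using SparseArrays

# Illustrative sketch: map each point's subscript to a linear index within the
# bounding volume and record its row number in a sparse vector, so a voxel
# coordinate can be mapped back to its row in `points`.
function node_lookup_sketch(points::Matrix{Int})
    maxs = vec(maximum(points, dims=1))
    li   = LinearIndices((maxs[1], maxs[2], maxs[3]))
    inds = [li[points[i, 1], points[i, 2], points[i, 3]] for i in 1:size(points, 1)]
    return sparsevec(inds, collect(1:size(points, 1)), length(li))
end
```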

Parameters:

point_array: an N×3 array recording all the voxel coordinates inside the object.
path_nodes: the indices of nodeNet voxels in the point_array
path_edges: the index pairs of nodeNet nodes in the point_array

In path_nodes, the nodes are encoded as indices into point_array. Since we no longer need point_array, which could be pretty big, we only keep the nodeNet coordinates and let the GC release point_array. The same applies to path_edges.

source
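A hedged sketch of that conversion; the names and the tuple representation of edges are illustrative assumptions:

```julia
# Illustrative sketch: keep only the coordinates of the nodeNet nodes and
# remap edge endpoints, so the (potentially large) point_array can be freed.
function materialize_nodes_sketch(point_array::Matrix, path_nodes::Vector{Int},
                                  path_edges::Vector{Tuple{Int,Int}})
    node_coords = point_array[path_nodes, :]
    # map point_array indices -> positions in the new node list
    index_map = Dict(idx => k for (k, idx) in enumerate(path_nodes))
    edges = [(index_map[a], index_map[b]) for (a, b) in path_edges]
    return node_coords, edges
end
```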
find_new_root_node( points )

Extracts the point with the lowest linear index from the Array of passed points

source
find_new_root_node( dbf )

Extracts the point with the largest DBF value from the Array of passed points

source
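For instance, the DBF-based variant amounts to something like the following sketch (assuming `points` is an N×3 array and `dbf` the matching vector of DBF values are both available, unlike the single-argument signature above):

```julia
# Minimal sketch: the new root is the point whose DBF value is largest.
find_root_by_dbf_sketch(points::Matrix, dbf::Vector) = points[argmax(dbf), :]
```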
get_connectivity_matrix(edges::Vector)

Construct a sparse connectivity matrix according to the edges

source
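A hedged sketch of building such a matrix from an edge list, assuming edges are given as index pairs and each undirected edge appears once:

```julia
using SparseArrays

# Build a symmetric sparse adjacency matrix from a vector of (i, j) index pairs.
function connectivity_matrix_sketch(edges::Vector{Tuple{Int,Int}}, nNodes::Int)
    us = [e[1] for e in edges]
    vs = [e[2] for e in edges]
    vals = fill(true, 2 * length(edges))
    # symmetrize so the matrix can be indexed in either direction
    return sparse(vcat(us, vs), vcat(vs, us), vals, nNodes, nNodes)
end
```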

Get a binary buffer formatted as a Neuroglancer nodeNet (skeleton).

Binary format

UInt32: number of vertices
UInt32: number of edges
Array{Float32,2}: N×3 array, xyz coordinates of vertices
Array{UInt32,2}: M×2 array, node index pairs of edges

reference: https://github.com/seung-lab/neuroglancer/wiki/Skeletons

source
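A hedged sketch of producing such a buffer; the actual package function may differ, and the interleaved per-vertex and per-edge records follow the linked Neuroglancer skeleton spec:

```julia
# Hedged sketch: serialize vertices (N×3 Float32) and edges (M×2 UInt32)
# into the layout described above (byte order follows the host machine).
function neuroglancer_buffer_sketch(vertices::Array{Float32,2}, edges::Array{UInt32,2})
    io = IOBuffer()
    write(io, UInt32(size(vertices, 1)))  # number of vertices
    write(io, UInt32(size(edges, 1)))     # number of edges
    # Julia arrays are column-major, so transpose to emit x,y,z per vertex
    write(io, permutedims(vertices))      # vertex coordinates, Float32
    write(io, permutedims(edges))         # edge index pairs, UInt32
    return take!(io)
end
```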
get_segment_point_num(self::NodeNet)

Get the number of branching points

source

Assumes that the graph is acyclic (no loops).

source
get_sholl_number(self::NodeNet, radius::AbstractFloat)

Get the number of neurite points that intersect with a sphere of the given radius centered on the root node

source
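One common way to compute a Sholl number is to count crossings of that sphere; the sketch below uses this definition and illustrative argument names, which may differ from the package's exact method:

```julia
using LinearAlgebra

# Hedged sketch: count edges whose endpoints lie on opposite sides of a
# sphere of radius r centered on the root node.
function sholl_number_sketch(nodes::Matrix{Float64}, edges::Vector{Tuple{Int,Int}},
                             root::Vector{Float64}, r::AbstractFloat)
    dist(i) = norm(nodes[i, :] .- root)
    return count(e -> (dist(e[1]) - r) * (dist(e[2]) - r) < 0, edges)
end
```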
local_max_multiplicative_penalty( weights, dbf, G )

Returns a version of the edge weights, modified by the DBF-based penalty

w = w * (1 - DBFdest/DBFstar)

Where DBFstar is the maximum DBF value of an outwards neighbor for each node

source
make_neighbor_graph( points )

Creates a LightGraphs graph from a point cloud, such that each point has a node (indexed in the same order as the rows), and each node is connected to its neighbors (by 26-connectivity).

Also returns a sparse matrix of weights for these edges, equal to the Euclidean distance between the subscripts of the connected points. These can be weighted or modified easily upon return.

source
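The construction can be sketched roughly as follows; this is illustrative only and assumes `points` holds integer voxel subscripts, one per row:

```julia
using LightGraphs, SparseArrays, LinearAlgebra

# Illustrative sketch: connect each point to any point whose subscript differs
# by at most one in every dimension (26-connectivity), and record edge weights
# equal to the Euclidean distance between the two subscripts.
function neighbor_graph_sketch(points::Matrix{Int})
    n = size(points, 1)
    lookup = Dict(Tuple(points[i, :]) => i for i in 1:n)
    g = Graph(n)
    is, js, ws = Int[], Int[], Float64[]
    for i in 1:n, dx in -1:1, dy in -1:1, dz in -1:1
        (dx, dy, dz) == (0, 0, 0) && continue
        j = get(lookup, (points[i, 1] + dx, points[i, 2] + dy, points[i, 3] + dz), 0)
        if j > 0
            add_edge!(g, i, j)
            push!(is, i); push!(js, j); push!(ws, norm((dx, dy, dz)))
        end
    end
    return g, sparse(is, js, ws, n, n)
end
```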
nodes_within_radius( sub, sub2node, r, max_dims )

Identifies the node indices within radius r of the given subscript

source
remove_path_from_rns!( reachableNodeList, path, points, sub2node, dbf, max_dims, scale_param, const_param)

Identifies the nodes to remove from the reachable nodes within the graph. Probably the ugliest function here, and should be optimized later

TO OPTIMIZE

source
save(self::NodeNet, cellId::UInt32, d_json::Associative, d_bin::Associative)

Save the nodeNet in Google Cloud Storage for Neuroglancer visualization. The format is the same as that used for meshes.

source

Save a binary file of point pairs, used by the Neuroglancer Python interface to visualize the nodeNet.

source
translate_to_origin!( points )

Normalize the point dimensions by subtracting the min across each dimension. This step isn't strictly necessary, but might be useful for compatibility with the MATLAB code. Record the offset and add it back after building the nodeNet.

source
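A minimal sketch of this normalization, assuming 1-based integer coordinates (the offset-recording detail follows the description above):

```julia
# Minimal sketch: shift every dimension so its minimum coordinate becomes 1,
# and return the offset so it can be added back after building the nodeNet.
function translate_to_origin_sketch!(points::Matrix{Int})
    offset = vec(minimum(points, dims=1)) .- 1
    points .-= offset'   # subtract the per-column offset from every row
    return points, offset
end
```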
compute_DBF( pointCloud )

Returns an array of DBF values for the point cloud. Currently creates a binary image, and runs bwd2 on it, though ideally we'd get rid of the need for an explicit bin_im

source

Compute the Distance from Boundary Field (DBF) based on the point cloud and the boundary points.

WARNING: this function does not work correctly!

source
compute_DBF( bin_im )
source

Use the segmentation to get a binary image, which saves memory.

source
create_binary_image( pointCloud )

Creates a boolean volume where the non-segment indices map to true, while the segment indices map to false.

source
create_binary_image( seg, obj_id )

Creates a boolean volume where the non-segment indices map to true, while the segment indices map to false

source
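For the segmentation variant, the behavior described above amounts to something like this trivial sketch:

```julia
# Trivial sketch: object voxels (seg .== obj_id) map to false, all others to true.
create_binary_image_sketch(seg::Array, obj_id) = seg .!= obj_id
```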
distance_transform( d::AbstractArray{T,N}, voxelSize::Vector{Float32}=ones(Float32, N) )

Returns a Euclidean distance transform of the mask provided by d. The return value will be a volume of the same size as d, where the value at each index corresponds to the distance between that location and the nearest location for which d > 0.

source
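A brute-force reference for these semantics (quadratic in the number of voxels, for small volumes or sanity checks only; the package uses a proper multi-dimensional EDT):

```julia
# Reference sketch: for every voxel, the distance to the nearest voxel with
# d > 0, scaled per dimension by voxelSize. O(#voxels × #features); slow.
function edt_reference(d::AbstractArray{T,N},
                       voxelSize::Vector{Float32}=ones(Float32, ndims(d))) where {T,N}
    features = findall(x -> x > zero(T), d)
    out = fill(Inf32, size(d))
    for idx in CartesianIndices(d), f in features
        dist = sqrt(sum(abs2, (Tuple(idx) .- Tuple(f)) .* Tuple(voxelSize)))
        dist < out[idx] && (out[idx] = Float32(dist))
    end
    return out
end
```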
extract_dbf_values( dbf_image, pointCloud )

Takes an array where rows indicate subscripts, and extracts the values within a volume at those subscripts (in row order)

source
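A one-line sketch of that extraction, assuming 3D subscripts:

```julia
# Read the DBF volume at each point's subscript, preserving row order.
extract_dbf_sketch(dbf_image::Array, points::Matrix{Int}) =
    [dbf_image[points[i, 1], points[i, 2], points[i, 3]] for i in 1:size(points, 1)]
```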

Fills an n-dimensional volume with initial states for the EDT: Inf for non-feature voxels, and 0 for feature voxels.

source

Getting too tired to document these next few, but will be worth it if it works

source
remove_euclidean_distance_transform
source

Performs the EDT over a specific row in the volume, following the first dimension

source

Performs the EDT along the first dimension of the N-dimensional volume

source