This package lets you write complicated formulae in index notation, which are turned into Julia's usual broadcasting, permuting, slicing, and reducing operations. It does little you couldn't do yourself, but provides a notation in which it is often easier to confirm that you are doing what you intend.
Source, issues, etc: github.com/mcabbott/TensorCast.jl
Version 0.2 was a re-write, see the release notes to know what changed.
Version 0.4 has significant changes:
- Broadcasting options and index ranges are now written `@cast @avx A[i,j] := B[i⊗j] (i ∈ 1:3)` instead of `@cast A[i,j] := B[i⊗j] i:3, avx` (using LoopVectorization.jl for the broadcast, and supplying the range of `i`).
- To return an array without naming it, write an underscore `@cast _[i] := ...` rather than omitting it entirely.
- Some fairly obscure features have been removed for simplicity: indexing by an array `@cast A[i,k] := B[i,J[k]]` and by a range `@cast C[i] := f(D[1:3, i])` will no longer work.
- Some dimension checks are inserted by default; previously these required an explicit option.
- It uses LazyStack.jl to combine handling of slices, simplifying earlier code. This is lazier by default; write `@cast A[i,k] := log(B[k][i]) lazy=false` (with a new keyword option) to glue into an `Array`.
- It uses TransmuteDims.jl to handle all permutations & many reshapes. This is lazier by default – the earlier code sometimes copied to avoid reshaping a `PermutedDimsArray`. This isn't always faster, though, and can be disabled by a keyword option.
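A minimal sketch of the 0.4 reshaping syntax described above (the array values here are illustrative, not from the package's own docs):

```julia
using TensorCast

B = collect(1:6)               # a vector of length 6

# Split the combined index i⊗j, supplying the range of i.
# Column-major order means i varies fastest.
@cast A[i,j] := B[i⊗j] (i in 1:2)
# A == [1 3 5; 2 4 6]

# An underscore returns the result without naming it:
C = @cast _[j,i] := A[i,j] + 10
# C == [11 12; 13 14; 15 16]
```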
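Likewise, a short sketch of the slice-gluing behaviour, assuming the LazyStack.jl semantics described above:

```julia
using TensorCast

B = [[1.0, 2.0], [3.0, 4.0]]   # a vector of 2 vectors

# Glue the inner vectors into a matrix, with the inner index i first;
# lazy=false forces the result to be a solid Array.
@cast A[i,k] := log(B[k][i]) lazy=false
# A[i,k] == log(B[k][i]), i.e. A == log.([1.0 3.0; 2.0 4.0])
```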
New features in 0.4:
- Indices can appear outside of indexing: `@cast A[i,j] = i+j` translates to `A .= axes(A,1) .+ axes(A,2)'`.
- The ternary operator `? :` can appear on the right, and will be broadcast correctly.
- All operations should now support OffsetArrays.jl.
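To illustrate the first two of these features, here is a small sketch (values chosen for the example, not taken from the package's docs):

```julia
using TensorCast

A = zeros(Int, 3, 4)

# Bare indices broadcast as their axes, so this fills A[i,j] with i+j:
@cast A[i,j] = i + j            # equivalent to A .= axes(A,1) .+ axes(A,2)'
# A[2,3] == 5

# The ternary operator is broadcast elementwise:
@cast B[i,j] := iseven(A[i,j]) ? 0 : 1
```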
The documentation describes:
- Use of `@cast` for broadcasting, and for dealing with arrays of arrays.
- `@reduce` and `@matmul`, for taking the sum (or the maximum, etc.) over some dimensions, and for matrix multiplication.
- Options: broadcasting with Strided.jl, LoopVectorization.jl, LazyArrays.jl, and slicing with StaticArrays.jl
- Docstrings, which list the complete set of possibilities.
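A brief sketch of the reduction macros mentioned above (see the docstrings for the full set of options):

```julia
using TensorCast

M = [1 2; 3 4]

# Sum over the index j, leaving a vector indexed by i:
@reduce S[i] := sum(j) M[i,j]
# S == [3, 7]

# Matrix multiplication written in index notation:
@matmul P[i,k] := sum(j) M[i,j] * M[j,k]
# P == M * M
```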