cast - Being hyper-aware of casting is important if you are concerned about efficient processing of big data. This can be an easy source of discrepancies across different implementations of a common algorithm.
strict typing - Some languages have loose typing and are happy to "cast" variables behind the scenes. This is convenient, but risky: the implicit cast can silently change your results.
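A minimal Python sketch of the kind of discrepancy implicit integer/float handling can cause (the values here are made up for illustration):

```python
# Hypothetical example: computing the mean of integer counts.
counts = [1, 2, 2]

# Integer division silently truncates: 5 // 3 == 1, not 1.666...
# Some languages/libraries do this implicitly when all operands are integers.
truncated = sum(counts) // len(counts)

# Casting explicitly to float documents intent and avoids the discrepancy.
exact = float(sum(counts)) / len(counts)

print(truncated, exact)   # 1 1.6666666666666667
```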
tolerance - because floating-point representation is finite (see precision), compare floats within a small tolerance rather than testing for exact equality.
precision - there is a finite level of precision with which your programming language represents floating-point numbers (e.g. double precision allocates 8 bytes (64 bits) per number; single precision allocates 4 bytes (32 bits)).
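For example, in any language using IEEE 754 doubles (Python shown here), exact equality tests on floats can fail, which is why a tolerance is needed:

```python
import math

# Classic double-precision surprise: 0.1 + 0.2 is not exactly 0.3.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Compare within a tolerance instead of testing exact equality.
print(math.isclose(0.1 + 0.2, 0.3, rel_tol=1e-9))   # True
```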
bug
dependencies - Your code might depend on libraries and/or other functions, etc.
Safe programming:
input checking
unit tests - Explicit tests that you set up to check the functionality of some code against known results (ground truth), or at least to check that the code runs successfully on some sample data. Another example: maybe your code can be checked by implementing a function and its inverse, and confirming that you get back what you started with.
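A sketch of both ideas in Python, using a hypothetical temperature-conversion function and its inverse:

```python
def celsius_to_fahrenheit(c):
    return c * 9.0 / 5.0 + 32.0

def fahrenheit_to_celsius(f):
    return (f - 32.0) * 5.0 / 9.0

def test_known_values():
    # Check against ground-truth results.
    assert abs(celsius_to_fahrenheit(0.0) - 32.0) < 1e-9
    assert abs(celsius_to_fahrenheit(100.0) - 212.0) < 1e-9

def test_round_trip():
    # A function composed with its inverse should recover the input.
    for c in [-40.0, 0.0, 36.6, 100.0]:
        assert abs(fahrenheit_to_celsius(celsius_to_fahrenheit(c)) - c) < 1e-9

test_known_values()
test_round_trip()
print("all tests passed")
```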
cast explicitly to ensure your code operates as expected
assert - halt immediately with an informative error if a condition you believe must hold turns out to be false.
corner cases - extreme or unusual inputs (e.g. empty input, zeros, a single element) that often expose bugs.
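A Python sketch combining input checking, assert, and corner cases, using a hypothetical normalize function:

```python
def normalize(values):
    """Scale values to sum to 1; check inputs before doing any math."""
    assert len(values) > 0, "empty input"              # corner case: empty list
    assert all(v >= 0 for v in values), "negative value"
    total = sum(values)
    assert total != 0, "values sum to zero"            # corner case: all zeros
    return [v / total for v in values]

print(normalize([1, 1, 2]))   # [0.25, 0.25, 0.5]
```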
seed - if any part of your code is "random", consider controlling your random number generator's seed so that results are reproducible.
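For example, with Python's standard-library random module, re-seeding reproduces the same draws:

```python
import random

random.seed(42)
first = [random.random() for _ in range(3)]

random.seed(42)          # re-seeding reproduces the same "random" sequence
second = [random.random() for _ in range(3)]

print(first == second)   # True
```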
Coding practices:
hard-code - in general, you want to avoid hard-coding values (paths, sizes, thresholds) to ensure maximal generality of your code.
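A small Python illustration (the data and threshold are made up):

```python
# Hard-coded: works only for this one dataset and this one threshold.
def count_outliers_hardcoded():
    data = [1.2, 8.9, 3.4, 9.7]
    return sum(1 for x in data if x > 5.0)

# General: data and threshold are parameters (with a sensible default).
def count_outliers(data, threshold=5.0):
    return sum(1 for x in data if x > threshold)

print(count_outliers([1.2, 8.9, 3.4, 9.7]))        # 2
print(count_outliers([1.2, 8.9, 3.4, 9.7], 9.0))   # 1
```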
refactor - changing or cleaning up the design and structure of functions/dependencies, etc., without necessarily changing the actual functionality. One example is making your code more modular (more, smaller functions). Often, you need to refactor to keep things consistent across different developers, etc.
one-liner - a stylistic decision. Sometimes you want to stick a bunch of "boring" stuff into one gigantic line.
bookkeeping - sometimes, it's useful to segregate your code into general sections, like "bookkeeping" vs. "figure generation" vs. "hardcore math".
Advanced strategies:
precompute - compute values once, ahead of time, rather than recomputing them repeatedly (e.g. inside a loop).
cache - store the results of expensive computations so that repeated requests can be served from memory.
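A Python sketch of both ideas: functools.lru_cache memoizes a function's results, and a lookup table precomputes values once outside any hot loop:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Cached: each fib(n) is computed once, then served from memory.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Precompute: build a lookup table once, then index into it cheaply.
SQUARES = {n: n * n for n in range(1000)}

print(fib(50))       # fast with caching; astronomically slow without it
print(SQUARES[31])   # 961
```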
CPU (time)
RAM (memory space)
overhead - there are different ways to code something, and maybe you want to optimize for minimal CPU time and/or minimal RAM usage. However, certain approaches incur overhead, and you have to carefully consider this when choosing design paths.
preallocation - growing a variable incrementally is usually costly (each growth step may copy the whole thing), so consider pre-allocating it at its final size.
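A Python/NumPy sketch (assuming NumPy is available) contrasting incremental growth with preallocation:

```python
import numpy as np

n = 10_000

# Costly pattern: np.append copies the entire array on every call.
grown = np.array([])
for i in range(100):                 # only 100 shown; cost explodes with size
    grown = np.append(grown, i)

# Better: preallocate at the final size once, then fill in place.
pre = np.empty(n)
for i in range(n):
    pre[i] = i

print(grown.size, pre.size)   # 100 10000
```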
vectorization - replacing an explicit for-loop with a single operation applied to an entire vector or matrix at once.
broadcasting - automatically expanding/repeating a scalar (or vector) as necessary to perform mathematical operations
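A Python/NumPy sketch (assuming NumPy is available) of both vectorization and broadcasting:

```python
import numpy as np

x = np.arange(10_000, dtype=np.float64)

# Loop version: one Python-level operation per element.
y_loop = np.empty_like(x)
for i in range(x.size):
    y_loop[i] = 2.0 * x[i] + 1.0

# Vectorized version: one call on the whole array; the scalars 2.0 and
# 1.0 are broadcast across every element automatically.
y_vec = 2.0 * x + 1.0

print(np.allclose(y_loop, y_vec))   # True

# Broadcasting also expands a vector against a matrix:
row = np.array([10.0, 20.0, 30.0])
mat = np.ones((2, 3))
print(mat + row)                    # row is reused for each of the 2 rows
```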
parallelization
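A Python sketch using the standard-library concurrent.futures module; note that in CPython, threads mainly help I/O-bound work, while CPU-bound work typically needs a process pool instead:

```python
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    # Placeholder task: sum one chunk of data.
    return sum(chunk)

# Split the data into independent chunks.
chunks = [range(0, 100), range(100, 200), range(200, 300)]

# Run the chunks concurrently and combine the partial results.
with ThreadPoolExecutor(max_workers=3) as pool:
    partial_sums = list(pool.map(work, chunks))

print(sum(partial_sums))   # same answer as summing serially: 44850
```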
profiling - measuring where your code actually spends its time (and memory), so that you optimize the true bottlenecks rather than guessed ones.
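A Python sketch using the standard-library timeit and cProfile tools (the function being profiled is made up):

```python
import cProfile
import timeit

def slow_square_sum(n):
    total = 0.0
    for i in range(n):
        total += i * i
    return total

# timeit: time a snippet many times for a stable estimate.
t = timeit.timeit(lambda: slow_square_sum(10_000), number=100)
print(f"100 runs took {t:.3f} s")

# cProfile: break time down per function call to find the real bottleneck.
profiler = cProfile.Profile()
profiler.enable()
slow_square_sum(100_000)
profiler.disable()
profiler.print_stats(sort="cumulative")
```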
memory management - when dealing with large data, you may wish to carefully consider (1) your choice of data types and (2) careful variable management (e.g. clearing large variables from the "workspace" once they are no longer needed).
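A Python/NumPy sketch (assuming NumPy is available) of both points:

```python
import numpy as np

n = 1_000_000

# (1) Choice of data type: float64 uses twice the memory of float32.
a64 = np.zeros(n, dtype=np.float64)
a32 = np.zeros(n, dtype=np.float32)
print(a64.nbytes, a32.nbytes)   # 8000000 4000000

# (2) Variable management: free large intermediates you no longer need.
intermediate = a64 * 2.0
result = intermediate.sum()
del intermediate                # drop the reference so memory can be reclaimed
```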