Using __builtin_c{l,t}zl fails on 64-bit systems, where unsigned long is
8 bytes. However, unsigned int is only guaranteed to be 2 bytes, so we
cannot simply switch to __builtin_c{l,t}z either. Introduce a check that
should be optimized away by the compiler.
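A minimal sketch of the idea (the function name is made up here, and this is
not the exact patch): a sizeof check picks the builtin whose operand type is
wide enough for a 32-bit value, and since the comparison is a compile-time
constant the branch is folded away.

    #include <stdint.h>
    #include <limits.h>

    static inline int clz32(uint32_t v)   // caller ensures v != 0
    {
        if (sizeof(unsigned int) * CHAR_BIT >= 32)
            return __builtin_clz(v);
        // unsigned int may be as narrow as 16 bits; fall back to the long
        // variant and correct for its extra width.
        return __builtin_clzl(v) - (int)(sizeof(unsigned long) * CHAR_BIT - 32);
    }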
Change-Id: I854d0817c6bb5ae13c257241240664bf8f1a7c8a
With AArch64, the enumerators with values in the range
0x80000000 to 0x80000007 are being assigned to ssize_t-typed
variables, which are 64-bit rather than 32-bit, and are being
used in conditions that check for a negative value. Those
values are not negative when ssize_t is 64-bit, so redefine
them relative to the INT32_MIN value.
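A minimal sketch of the problem and the fix (enumerator names are
illustrative, not copied from the patch): 0x80000000 is a positive constant,
so once it is widened into a 64-bit ssize_t the "err < 0" checks stop firing;
anchoring the range at INT32_MIN keeps the values negative regardless of the
width of ssize_t.

    #include <stdint.h>
    #include <sys/types.h>   // ssize_t

    enum {
        UNKNOWN_ERROR = INT32_MIN,       // was 0x80000000
        NO_MEMORY     = UNKNOWN_ERROR + 1
        // ... the remaining codes follow, up to the old 0x80000007
    };

    void check() {
        ssize_t err = UNKNOWN_ERROR;
        if (err < 0) {
            // taken whether ssize_t is 32-bit or 64-bit; with the old
            // 0x80000000-based values this branch was skipped on AArch64.
        }
    }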
Change-Id: I7a031a940a28658b3bf34bebac93dfb3ba397b05
Signed-off-by: Marcus Oakland <marcus.oakland@arm.com>
Signed-off-by: Ashok Bhat <ashok.bhat@arm.com>
This includes removing the map_info.c source and replacing it with the
BacktraceMap class to handle all map-related code.
Change all callers of libbacktrace map functionality.
Also modify the corkscrew thread code so that it doesn't need to build
the map twice (once in the corkscrew format and once in the libbacktrace
format).
Change-Id: I32865a39f83a3dd6f958fc03c2759ba47d12382e
Use a better name for this. The old name was a bit confusing.
Change-Id: I1261f2ee3854a9c8b82133ad0bfbbbe48b43c9ac
(cherry picked from commit 242b1a8c7a)
Conflicts:
libbacktrace/Backtrace.cpp
Fix a small bug in the Printer where the prefix was not properly
prepended to strings.
(cherry picked from commit 9b0e074c6d)
Change-Id: I78bfa3f76864c34f33fb439bf20dfc85616f1077
This was copied from libcore/include quite a while ago, but the
canonical version has since moved out to a generic library called
libnativehelper. All users of this header should already have
libnativehelper on their include path, so switching to the canonical
version is as easy as removing the "utils/" part.
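For illustration only (the header name below is hypothetical; the point is
just dropping the directory prefix), a call site changes like this:

    -#include <utils/SomeJniHeader.h>
    +#include <SomeJniHeader.h>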
Change-Id: Iae8e59bf3eee573bfa78381866989934e5bbf19d
Making an object Flattenable doesn't force it to
become virtual anymore. For instance, Fence and GraphicBuffer
are now non-virtual classes.
Also change the Flattenable protocol a bit so that it updates
its parameters (pointers, sizes) in place, making it easier
to implement a flattenable in terms of other flattenables.
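A minimal sketch of the idea (simplified types and names, not the exact
libutils API): flatten() takes its destination pointer and remaining size by
reference and advances them past what it wrote, so an aggregate can flatten
its members back-to-back without doing any offset bookkeeping itself.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct Inner {
        uint32_t value;
        int flatten(void*& buffer, size_t& size) const {
            if (size < sizeof(value)) return -1;          // not enough room
            memcpy(buffer, &value, sizeof(value));
            buffer = static_cast<unsigned char*>(buffer) + sizeof(value);
            size  -= sizeof(value);
            return 0;
        }
    };

    struct Outer {
        Inner a, b;
        int flatten(void*& buffer, size_t& size) const {
            // Implemented in terms of other flattenables: each call updates
            // buffer/size, so the next call writes right after the previous one.
            if (int err = a.flatten(buffer, size)) return err;
            return b.flatten(buffer, size);
        }
    };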
Change-Id: Ie81dc7637180b3c2cfcbaf644f8987ca804eb891
when libutils is statically linked, the order of the static
initializers is not guaranteed and it is therefore unsafe to use
empty static strings, e.g.:
static String8 sThisStaticStringIsNotSafe;
instead, this new constructor can be used:
static String8 sThisStaticStringIsSafe(kEmptyString);
Change-Id: Ia3daf1cab1c97d021c0ee9c2b394b5e27e8d6c0d
This is just to support the watchdog, giving it a faster
way to determine whether a thread is deadlocked without having
to post a message to it.
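A minimal sketch of the idea only; the commit message does not name the API
it adds, so the class and method below are hypothetical. The worker publishes
a "still processing" flag that the watchdog can read directly from its own
thread, instead of posting a message and waiting for the reply.

    #include <atomic>

    class Worker {
    public:
        // hypothetical query the watchdog calls from its own thread
        bool isProcessing() const {
            return mProcessing.load(std::memory_order_relaxed);
        }

        void handleMessage() {
            mProcessing.store(true, std::memory_order_relaxed);
            // ... do the work ...
            mProcessing.store(false, std::memory_order_relaxed);
        }

    private:
        std::atomic<bool> mProcessing{false};
    };

    // Watchdog side: if isProcessing() stays true for too long, the thread is
    // likely stuck; no round-trip message is needed to find out.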
Change-Id: I068dc8b9387caf94fe5811fb4aeb0f9b57b1a080
- added a ctor that updates and dumps the stack immediately
- added a "logtag" parameter to dump()
Change-Id: Ie51c256071d282591752243bdb4f68cf9ff8829d
background:
we have some code to fix up the IDs of references when
using RefBase's DEBUG_REFS, for refs that are managed by
arrays of wp<> or sp<> (this is because wp<> / sp<> don't have
a trivial ctor when DEBUG_REFS is enabled, while Vector
treats them as trivial for obvious performance reasons).
this is complicated by the fact that we don't want to have
to recompile everything when enabling DEBUG_REFS (i.e. the
Vector code cannot know whether it's enabled or not for its
template machinery).
problem:
there was a bug in the fix-up code for wp<>: it was trying
to access the weakref_impl through the RefBase*. This was
moronic, since the RefBase could already have been destroyed
if there were no strong refs left -- and that did happen.
Instead we need to get the weakref_impl directly from the wp<>.
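A self-contained analogue of the bug and the fix (stand-in types, not the
real RefBase internals): the weak-ref bookkeeping object outlives the object
it belongs to, so the fix-up must reach it through the weak pointer's own
member, never through the possibly-destroyed object pointer.

    struct Impl { int id; };                      // stands in for weakref_impl

    struct Object {                               // stands in for RefBase
        Impl* refs;
        Impl* getWeakRefs() const { return refs; }
    };

    struct WeakPtr {                              // stands in for wp<>
        Object* ptr;   // may dangle once the last strong ref is gone
        Impl*   refs;  // stays valid as long as any weak ref exists

        void renameId(int newId) {
            // buggy: ptr may already point at a destroyed object
            //   ptr->getWeakRefs()->id = newId;   // use-after-free
            // fixed: use the impl the weak pointer already holds
            refs->id = newId;
        }
    };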
Change-Id: Ie16e334204205fdbff142acb9faff8479a78450b